{"docstore/data": {"b01e3aaa-a296-4d1c-bf00-9f8c0d1d55dc": {"__data__": {"id_": "b01e3aaa-a296-4d1c-bf00-9f8c0d1d55dc", "embedding": null, "metadata": {"page_label": "1", "file_name": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_type": "application/pdf", "file_size": 456031, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "MHRA GxP Data Integrity Guidance and Definitions - March 2018: Ensuring Data Integrity in Good Practice Regulations", "questions_this_excerpt_can_answer": "1. What is the title of the MHRA guidance document that focuses on ensuring data integrity within Good Practice Regulations as of March 2018?\n \n2. As of the last update in March 2018, how many pages does the MHRA GxP Data Integrity Guidance and Definitions document contain?\n\n3. 
What specific version or revision number was assigned to the MHRA GxP Data Integrity Guidance and Definitions document released in March 2018?", "excerpt_keywords": "MHRA, GxP, Data Integrity, Guidance, Definitions"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf\n# mhra regulating medicines and medical devices\n\n## medicines & healthcare products regulatory agency (mhra)\n\n## gxp data integrity guidance and definitions\n\nmarch 2018\n\nmhra gxp data integrity guidance and definitions; revision 1: march 2018\n\npage 1 of 21", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "21d7a945-535b-49dc-8671-d9a9990671c0": {"__data__": {"id_": "21d7a945-535b-49dc-8671-d9a9990671c0", "embedding": null, "metadata": {"page_label": "2", "file_name": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_type": "application/pdf", "file_size": 456031, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "MHRA GxP Data Integrity Guidance and Definitions: Ensuring Compliance and Accuracy in Regulated Environments", "questions_this_excerpt_can_answer": "1. What specific guidance does the MHRA GxP Data Integrity Guidance and Definitions document provide regarding the handling and interpretation of raw data in compliance with ICH GCP standards?\n\n2. How does the MHRA document outline the process and considerations for data transfer or migration within regulated environments to ensure data integrity and compliance?\n\n3. 
What are the detailed requirements and definitions provided by the MHRA for establishing computerised system user access and system administrator roles to safeguard data integrity in GxP regulated environments?", "prev_section_summary": "The section is from the MHRA GxP Data Integrity Guidance and Definitions document released in March 2018. It focuses on ensuring data integrity within Good Practice Regulations. The document is titled \"MHRA GxP Data Integrity Guidance and Definitions - March 2018: Ensuring Data Integrity in Good Practice Regulations\" and is labeled as revision 1. The section mentions the Medicines & Healthcare Products Regulatory Agency (MHRA) and emphasizes the importance of regulating medicines and medical devices. It also indicates that the document contains 21 pages in total.", "excerpt_keywords": "MHRA, GxP, Data Integrity, Compliance, Regulated Environments"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf\n|content|page number|\n|---|---|\n|background|3|\n|introduction|3|\n|the principles of data integrity|4|\n|establishing data criticality and inherent integrity risk|5|\n|designing systems and processes to assure data integrity; creating the right environment|7|\n|definition of terms and interpretation of requirements|8|\n|data|8|\n|raw data (synonymous with source data which is defined in ich gcp)|8|\n|metadata|9|\n|data integrity|9|\n|data governance|9|\n|data lifecycle|10|\n|recording and collection of data|10|\n|data transfer / migration|10|\n|data processing|11|\n|excluding data (not applicable to gpvp):|11|\n|original record and true copy|11|\n|original record|11|\n|true copy|12|\n|computerised system transactions:|13|\n|audit trail|13|\n|electronic signatures|14|\n|data review and approval|15|\n|computerised system user access/system administrator roles|16|\n|data retention|17|\n|archive|18|\n|backup|18|\n|file 
structure|19|\n|validation - for intended purpose (gmp; see also annex 11, 15)|19|\n|it suppliers and service providers (including cloud providers and virtual service/platforms (also referred to as software as a service saas/platform as a service (paas) / infrastructure as a service (iaas))|19|\n|glossary|20|\n|references|21|\n\nmhra gxp data integrity guidance and definitions; revision 1: march 2018\n\npage 2 of 21", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "e43a0b04-d6ac-4139-baef-99825e68067b": {"__data__": {"id_": "e43a0b04-d6ac-4139-baef-99825e68067b", "embedding": null, "metadata": {"page_label": "3", "file_name": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_type": "application/pdf", "file_size": 456031, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "MHRA GxP Data Integrity Guidance: Promoting a Risk-Based Approach to Data Management and Compliance", "questions_this_excerpt_can_answer": "1. What specific regulatory bodies and international guidelines does the MHRA GxP Data Integrity Guidance align with, and how does it position itself in relation to these entities and their guidelines?\n \n2. How does the MHRA GxP Data Integrity Guidance suggest organizations balance the safeguarding of data integrity with other compliance priorities, and what is the recommended approach to achieving this balance?\n\n3. 
What is the scope of the MHRA GxP Data Integrity Guidance in terms of its applicability to different GxP areas, and how does it address the relevance of the guidance to areas not explicitly covered by provided examples?", "prev_section_summary": "The section provides guidance on data integrity in regulated environments, focusing on principles, data criticality, system design, terms definitions, data lifecycle, data transfer, data processing, original records, computerized system transactions, data review, user access roles, data retention, validation, IT suppliers, and glossary. Key entities include MHRA, GxP standards, ICH GCP, metadata, audit trail, electronic signatures, and backup.", "excerpt_keywords": "MHRA, GxP, Data Integrity, Compliance, Risk-Based Approach"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf\n## background\n\nthe way regulatory data is generated has continued to evolve in line with the ongoing development of supporting technologies such as the increasing use of electronic data capture, automation of systems and use of remote technologies; and the increased complexity of supply chains and ways of working, for example, via third party service providers. systems to support these ways of working can range from manual processes with paper records to the use of fully computerised systems. the main purpose of the regulatory requirements remains the same, i.e. having confidence in the quality and the integrity of the data generated (to ensure patient safety and quality of products) and being able to reconstruct activities.\n\n## introduction\n\n2.1 this document provides guidance for uk industry and public bodies regulated by the uk mhra including the good laboratory practice monitoring authority (glpma). where possible the guidance has been harmonised with other published guidance. 
the guidance is a uk companion document to pic/s, who, oecd (guidance and advisory documents on glp) and ema guidelines and regulations.\n\n2.2 this guidance has been developed by the mhra inspectorate and partners and has undergone public consultation. it is designed to help the user facilitate compliance through education, whilst clarifying the uk regulatory interpretation of existing requirements.\n\n2.3 users should ensure their efforts are balanced when safeguarding data from risk with their other compliance priorities.\n\n2.4 the scope of this guidance is designated as gxp in that everything contained within the guide is gxp unless stated otherwise. the lack of examples specific to a gxp does not mean it is not relevant to that gxp just that the examples given are not exhaustive. please do however note that the guidance document does not extend to medical devices.\n\n2.5 this guidance should be considered as a means of understanding the mhras position on data integrity and the minimum expectation to achieve compliance. the guidance does not describe every scenario so engagement with the mhra is encouraged where your approach is different to that described in this guidance.\n\n2.6 this guidance aims to promote a risk-based approach to data management that includes data risk, criticality and lifecycle. users of this guidance need to understand their data processes (as a lifecycle) to identify data with the greatest gxp impact. 
from that, the identification of the most effective and efficient risk-based control and review of the data can be determined and implemented.\n\n2.7 this guidance primarily addresses data integrity and not data quality since the controls required for integrity do not necessarily guarantee the quality of the data generated.\n\nmhra gxp data integrity guidance and definitions; revision 1: march 2018 page 3 of 21", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "16bf1a23-e3d2-4957-9f1d-f8f31d3c6e49": {"__data__": {"id_": "16bf1a23-e3d2-4957-9f1d-f8f31d3c6e49", "embedding": null, "metadata": {"page_label": "4", "file_name": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_type": "application/pdf", "file_size": 456031, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Data Integrity Principles and Guidelines for GxP Compliance: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. What is the recommended approach for organizations to ensure data integrity according to the MHRA GXP Data Integrity Guidance and Definitions document, and how does it suggest managing the inherent risks associated with data formats and criticality?\n \n2. How does the MHRA GXP Data Integrity Guidance and Definitions document suggest organizations should balance the effort and resources applied to data integrity assurance relative to the risk and impact of data integrity failure on patients or the environment?\n\n3. 
According to the MHRA GXP Data Integrity Guidance and Definitions document, how should organizations respond to identified data integrity weaknesses, and what is the guidance on the scope of implementing corrective and preventive actions?", "prev_section_summary": "The section provides background information on the evolving nature of regulatory data generation and introduces the MHRA GxP Data Integrity Guidance, which aims to promote a risk-based approach to data management and compliance. Key topics covered include the purpose of regulatory requirements, the scope of the guidance in relation to different GxP areas, the alignment with international guidelines and regulatory bodies, the importance of balancing data integrity with other compliance priorities, and the emphasis on understanding data processes and implementing risk-based controls. The document aligns with PIC/S, WHO, OECD, and EMA guidelines and regulations, and encourages engagement with the MHRA for scenarios not explicitly covered in the guidance.", "excerpt_keywords": "MHRA, GxP, Data Integrity, Guidance, Compliance"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf\n### 2.8\n\nthis guidance should be read in conjunction with the applicable regulations and the general guidance specific to each gxp. where gxp-specific references are made within this document (e.g. ich q9), consideration of the principles of these documents may provide guidance and further information.\n\n### 2.9\n\nwhere terms have been defined, it is understood that other definitions may exist and these have been harmonised where possible and appropriate.\n\n### 3. the principles of data integrity\n\n### 3.1\n\nthe organisation needs to take responsibility for the systems used and the data they generate. the organisational culture should ensure data is complete, consistent and accurate in all its forms, i.e. 
paper and electronic.\n\n### 3.2\n\narrangements within an organisation with respect to people, systems and facilities should be designed, operated and, where appropriate, adapted to support a suitable working environment, i.e. creating the right environment to enable data integrity controls to be effective.\n\n### 3.3\n\nthe impact of organisational culture, the behaviour driven by performance indicators, objectives and senior management behaviour on the success of data governance measures should not be underestimated. the data governance policy (or equivalent) should be endorsed at the highest levels of the organisation.\n\n### 3.4\n\norganisations are expected to implement, design and operate a documented system that provides an acceptable state of control based on the data integrity risk with supporting rationale. an example of a suitable approach is to perform a data integrity risk assessment (dira) where the processes that produce data or where data is obtained are mapped out and each of the formats and their controls are identified and the data criticality and inherent risks documented.\n\n### 3.5\n\norganisations are not expected to implement a forensic approach to data checking on a routine basis. systems should maintain appropriate levels of control whilst wider data governance measures should ensure that periodic audits can detect opportunities for data integrity failures within the organisations systems.\n\n### 3.6\n\nthe effort and resource applied to assure the integrity of the data should be commensurate with the risk and impact of a data integrity failure to the patient or environment. 
collectively these arrangements fulfil the concept of data governance.\n\n### 3.7\n\norganisations should be aware that reverting from automated or computerised systems to paper-based manual systems or vice-versa will not in itself remove the need for appropriate data integrity controls.\n\n### 3.8\n\nwhere data integrity weaknesses are identified, companies should ensure that appropriate corrective and preventive actions are implemented across all relevant activities and systems and not in isolation.\n\nmhra gxp data integrity guidance and definitions; revision 1: march 2018 page 4 of 21", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "65ef36e4-4f0e-4cad-8942-0539434406bf": {"__data__": {"id_": "65ef36e4-4f0e-4cad-8942-0539434406bf", "embedding": null, "metadata": {"page_label": "5", "file_name": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_type": "application/pdf", "file_size": 456031, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Ensuring Data Integrity and Regulatory Compliance through Effective Data Governance and Risk Assessment", "questions_this_excerpt_can_answer": "1. What specific attributes does the MHRA GXP Data Integrity Guidance and Definitions document attribute to the acronym ALCOA, and how does it differentiate from ALCOA+ in terms of data quality attributes for regulatory purposes?\n\n2. How does the MHRA GXP Data Integrity Guidance and Definitions document suggest handling significant data integrity incidents in terms of regulatory authority notification?\n\n3. 
According to the MHRA GXP Data Integrity Guidance and Definitions document, what factors should be considered when assessing the criticality of data and the inherent risks to its integrity, especially in relation to the method of data generation (e.g., paper-based, electronic, hybrid, or other means)?", "prev_section_summary": "The section discusses the principles of data integrity according to the MHRA GXP Data Integrity Guidance and Definitions document. Key topics include the organization's responsibility for data integrity, the importance of creating a suitable working environment for data integrity controls, the impact of organizational culture on data governance, the need for a documented system to control data integrity risks, the importance of appropriate levels of control and periodic audits, the need for resources to be commensurate with the risk of data integrity failure, the persistence of data integrity controls regardless of system type, and the implementation of corrective and preventive actions for identified data integrity weaknesses. Key entities mentioned include the organization, data, systems, facilities, data governance policy, data integrity risk assessment, data formats, controls, data criticality, risks, patient, environment, automated systems, manual systems, corrective actions, and preventive actions.", "excerpt_keywords": "MHRA, GXP, Data Integrity, ALCOA, Regulatory Compliance"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf\n### 3.9 appropriate notification to regulatory authorities should be made where significant data integrity incidents have been identified.\n\n### 3.10 the guidance refers to the acronym alcoa rather than alcoa +. alcoa being attributable, legible, contemporaneous, original, and accurate and the + referring to complete, consistent, enduring, and available. 
alcoa was historically regarded as defining the attributes of data quality that are suitable for regulatory purposes. the + has been subsequently added to emphasise the requirements. there is no difference in expectations regardless of which acronym is used since data governance measures should ensure that data is complete, consistent, enduring and available throughout the data lifecycle.\n\n### 4. establishing data criticality and inherent integrity risk\n\n4.1 data has varying importance to quality, safety and efficacy decisions. data criticality may be determined by considering how the data is used to influence the decisions made.\n4.2 the risks to data are determined by the potential to be deleted, amended or excluded without authorisation and the opportunity for detection of those activities and events. the risks to data may be increased by complex, inconsistent processes with open-ended and subjective outcomes, compared to simple tasks that are undertaken consistently, are well defined and have a clear objective.\n4.3 data may be generated by:\n(i) recording on paper, a paper-based record of a manual observation or of an activity or\n(ii) electronically, using equipment that range from simple machines through to complex highly configurable computerised systems or\n(iii) by using a hybrid system where both paper-based and electronic records constitute the original record or\n(iv) by other means such as photography, imagery, chromatography plates, etc.\npaper data generated manually on paper may require independent verification if deemed necessary from the data integrity risk assessment or by another requirement. 
consideration should be given to risk-reducing supervisory measures.\nelectronic the inherent risks to data integrity relating to equipment and computerised systems may differ depending upon the degree to which the system generating or using the data can be configured, and the potential for manipulation of data during transfer between computerised systems during the data lifecycle. the use of available technology, suitably configured to reduce data integrity risk, should be considered.\n\nmhra gxp data integrity guidance and definitions; revision 1: march 2018 page 5 of 21", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "71d3a657-f6df-4812-ba3f-254422b331f6": {"__data__": {"id_": "71d3a657-f6df-4812-ba3f-254422b331f6", "embedding": null, "metadata": {"page_label": "6", "file_name": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_type": "application/pdf", "file_size": 456031, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "MHRA Guidelines for Data Integrity in Regulating Medicines and Medical Devices: Ensuring Compliance and Quality Control", "questions_this_excerpt_can_answer": "1. What specific considerations does the MHRA guidance suggest for managing data integrity in hybrid systems used for regulating medicines and medical devices?\n \n2. How does the MHRA guidance propose handling data generated from non-traditional formats, such as photographs or imagery, in the context of data integrity for medicines and medical device regulation?\n\n3. 
According to the MHRA guidance, under what conditions can reduced effort and/or frequency of control measures be justified in the context of data integrity for product, patient, or environmental impact assessments?", "prev_section_summary": "The section discusses the importance of appropriate notification to regulatory authorities in case of significant data integrity incidents. It also explains the attributes of the ALCOA acronym (attributable, legible, contemporaneous, original, accurate) and the ALCOA+ acronym (complete, consistent, enduring, available) in terms of data quality for regulatory purposes. The document emphasizes the need for data governance measures to ensure data completeness, consistency, endurance, and availability throughout the data lifecycle. Additionally, it addresses the establishment of data criticality and inherent integrity risks, considering factors such as data usage in decision-making, risks to data integrity, and methods of data generation (paper-based, electronic, hybrid, or other means). The section highlights the importance of assessing and mitigating risks to data integrity, especially in electronic systems, to ensure regulatory compliance.", "excerpt_keywords": "MHRA, data integrity, regulatory compliance, hybrid systems, electronic systems"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf\n#### mhra\n\nregulating medicines and medical devices\n\nsimple electronic systems with no configurable software and no electronic data retention (e.g. ph meters, balances and thermometers) may only require calibration, whereas complex systems require validation for intended purpose. validation effort increases with complexity and risk (determined by software functionality, configuration, the opportunity for user intervention and data lifecycle considerations). it is important not to overlook systems of apparent lower complexity. 
within these systems, it may be possible to manipulate data or repeat testing to achieve the desired outcome with limited opportunity for detection (e.g. stand-alone systems with a user-configurable output such as ecg machines, ftir, uv spectrophotometers).\nhybrid where hybrid systems are used, it should be clearly documented what constitutes the whole data set and all records that are defined by the data set should be reviewed and retained. hybrid systems should be designed to ensure they meet the desired objective.\nother where the data generated is captured by a photograph or imagery (or other media), the requirements for storage of that format throughout its lifecycle should follow the same considerations as for the other formats, considering any additional controls required for that format. where the original format cannot be retained due to degradation issues, alternative mechanisms for recording (e.g. photography or digitisation) and subsequent storage may be considered and the selection rationale documented (e.g. thin layer chromatography).\n4.4 reduced effort and/or frequency of control measures may be justified for data that has a lesser impact to product, patient or the environment if those data are obtained from a process that does not provide the opportunity for amendment without high-level system access or specialist software/knowledge.\n4.5 the data integrity risk assessment (or equivalent) should consider factors required to follow a process or perform a function. it is expected to consider not only a computerised system but also the supporting people, guidance, training and quality systems. therefore, automation or the use of a validated system (e.g. e-crf; analytical equipment) may lower but not eliminate data integrity risk. 
where there is human intervention, particularly influencing how or what data is recorded, reported or retained, an increased risk may exist from poor organisational controls or data verification due to an overreliance on the systems validated state.\n4.6 where the data integrity risk assessment has highlighted areas for remediation, prioritisation of actions (including acceptance of an appropriate level of residual risk) should be documented, communicated to management, and subject to review. in situations where long-term remediation actions are identified, risk-reducing short-term measures should be implemented to provide acceptable data governance in the interim.\n\nmhra gxp data integrity guidance and definitions; revision 1: march 2018\n\npage 6 of 21", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "23cf1f26-a44b-45fb-84ea-3744a5fd1976": {"__data__": {"id_": "23cf1f26-a44b-45fb-84ea-3744a5fd1976", "embedding": null, "metadata": {"page_label": "7", "file_name": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_type": "application/pdf", "file_size": 456031, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "\"Strategies for Maintaining Data Integrity: System Design and Process Implementation\"", "questions_this_excerpt_can_answer": "1. What specific strategies does the MHRA GXP Data Integrity Guidance and Definitions document recommend for designing systems and processes to ensure data integrity, particularly in terms of recording timed events and managing user access rights?\n \n2. 
How does the MHRA GXP Data Integrity Guidance and Definitions document address the use of scribes in recording data for GXP activities, and what are the conditions under which their use is justified according to the document?\n\n3. What measures does the MHRA GXP Data Integrity Guidance and Definitions document suggest for preventing unauthorized data amendments and ensuring the traceability and reconstruction of data records across multiple sites?", "prev_section_summary": "The section discusses the considerations for managing data integrity in regulating medicines and medical devices, including the validation requirements for simple and complex electronic systems. It also addresses the handling of data generated from non-traditional formats like photographs or imagery, emphasizing the need for proper storage and documentation. The guidance suggests that reduced effort and frequency of control measures may be justified for data with lesser impact, but a thorough data integrity risk assessment is necessary. Prioritization of remediation actions and communication with management are also highlighted in the section.", "excerpt_keywords": "Data integrity, System design, Process implementation, User access rights, Scribes"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf\n## designing systems and processes to assure data integrity; creating the right environment\n\n5.1 systems and processes should be designed in a way that facilitates compliance with the principles of data integrity. 
enablers of the desired behavior include but are not limited to:\n\n- at the point of use, having access to appropriately controlled/synchronized clocks for recording timed events to ensure reconstruction and traceability, knowing and specifying the time zone where this data is used across multiple sites.\n- accessibility of records at locations where activities take place so that informal data recording and later transcription to official records does not occur.\n- access to blank paper proformas for raw/source data recording should be appropriately controlled. reconciliation, or the use of controlled books with numbered pages, may be necessary to prevent recreation of a record. there may be exceptions such as medical records (gcp) where this is not practical.\n- user access rights that prevent (or audit trail, if prevention is not possible) unauthorized data amendments. use of external devices or system interfacing methods that eliminate manual data entries and human interaction with the computerized system, such as barcode scanners, id card readers, or printers.\n- the provision of a work environment (such as adequate space, sufficient time for tasks, and properly functioning equipment) that permit performance of tasks and recording of data as required.\n- access to original records for staff performing data review activities.\n- reconciliation of controlled print-outs.\n- sufficient training in data integrity principles provided to all appropriate staff (including senior management).\n- inclusion of subject matter experts in the risk assessment process.\n- management oversight of quality metrics relevant to data governance.\n\n5.2 the use of scribes to record activity on behalf of another operator can be considered where justified, for example:\n\n- the act of contemporaneous recording compromises the product or activity e.g. 
documenting line interventions by sterile operators.\n- necropsy (glp)\n- to accommodate cultural or literacy/language limitations, for instance where an activity is performed by an operator but witnessed and recorded by a second person.\n\nconsideration should be given to ease of access, usability, and location whilst ensuring appropriate control of the activity guided by the criticality of the data.\n\nin these situations, the recording by the second person should be contemporaneous with the task being performed, and the records should identify both the person performing the task and the person completing the record. the person performing the task should countersign the record wherever possible, although it is accepted that this countersigning step will be retrospective. the process for supervisory (scribe) documentation completion should be described in an approved procedure that specifies the activities to which the process applies.\n\nmhra gxp data integrity guidance and definitions; revision 1: march 2018 page 7 of 21", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "48480ece-854a-448f-b69d-c0ad1c7b5e5e": {"__data__": {"id_": "48480ece-854a-448f-b69d-c0ad1c7b5e5e", "embedding": null, "metadata": {"page_label": "8", "file_name": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_type": "application/pdf", "file_size": 456031, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Understanding Data Integrity and Raw Data Definitions in GxP Regulations", "questions_this_excerpt_can_answer": "1. 
What are the specific characteristics that data must possess according to the MHRA GXP Data Integrity Guidance to ensure its integrity in GxP regulated activities?\n \n2. How does the MHRA GXP Data Integrity Guidance define \"raw data\" in the context of electronic and paper records, and what are the implications for data considered as raw data when captured by basic electronic equipment that does not store electronic data permanently?\n\n3. What measures does the MHRA GXP Data Integrity Guidance recommend to ensure data governance throughout the data lifecycle, and how are these measures defined in terms of data completeness, consistency, endurance, and availability?", "prev_section_summary": "The section discusses strategies for maintaining data integrity through system design and process implementation. Key topics include designing systems to facilitate compliance with data integrity principles, recording timed events accurately, managing user access rights, preventing unauthorized data amendments, and ensuring traceability of data records. The use of scribes in recording data for GXP activities is also addressed, with conditions for their justified use outlined. The section emphasizes the importance of creating the right environment for data integrity, including providing access to original records, sufficient training in data integrity principles, and management oversight of quality metrics.", "excerpt_keywords": "Data Integrity, GxP Regulations, Raw Data, MHRA Guidance, Electronic Records"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf\n## definition of terms and interpretation of requirements\n\nin the following section, definitions where applicable, are given in italic text directly below the term.\n\n### 6.1. data\n\nfacts, figures and statistics collected together for reference or analysis. 
all original records and true copies of original records, including source data and metadata and all subsequent transformations and reports of these data, that are generated or recorded at the time of the gxp activity and allow full and complete reconstruction and evaluation of the gxp activity.\n\ndata should be:\n\n- a - attributable to the person generating the data\n- l - legible and permanent\n- c - contemporaneous\n- o - original record (or certified true copy)\n- a - accurate\n\ndata governance measures should also ensure that data is complete, consistent, enduring, and available throughout the lifecycle, where:\n\n- complete - the data must be whole; a complete set\n- consistent - the data must be self-consistent\n- enduring - durable; lasting throughout the data lifecycle\n- available - readily available for review or inspection purposes\n\n### 6.2. raw data (synonymous with source data which is defined in ich gcp)\n\nraw data is defined as the original record (data) which can be described as the first-capture of information, whether recorded on paper or electronically. information that is originally captured in a dynamic state should remain available in that state.\n\nraw data must permit full reconstruction of the activities. where this has been captured in a dynamic state and generated electronically, paper copies cannot be considered as raw data.\n\nin the case of basic electronic equipment that does not store electronic data, or provides only a printed data output (e.g. 
balances or ph meters), then the printout constitutes the raw data.\n\nwhere the basic electronic equipment does store electronic data permanently and only holds a certain volume before overwriting; this data should be periodically reviewed and where necessary reconciled against paper records and extracted as electronic data where this is supported by the equipment itself.\n\nin all definitions, the term data includes raw data.\n\nmhra gxp data integrity guidance and definitions; revision 1: march 2018 page 8 of 21", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "bd053438-7e91-4bca-a4f4-c55e9336f3b6": {"__data__": {"id_": "bd053438-7e91-4bca-a4f4-c55e9336f3b6", "embedding": null, "metadata": {"page_label": "9", "file_name": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_type": "application/pdf", "file_size": 456031, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "\"Best Practices for Ensuring Data Integrity and Governance in Metadata Management\"", "questions_this_excerpt_can_answer": "1. How does the MHRA GXP Data Integrity Guidance and Definitions (2018) document define the role and importance of metadata in the context of data integrity and governance within pharmaceutical development and quality assurance processes?\n\n2. What specific examples does the MHRA GXP Data Integrity Guidance and Definitions (2018) provide to illustrate how metadata gives context and meaning to data within the pharmaceutical industry, and how does this support the principles of data integrity?\n\n3. 
According to the MHRA GXP Data Integrity Guidance and Definitions (2018), what are the key components and principles of data governance that organizations should implement to ensure the integrity and reliability of their data throughout its lifecycle, particularly in the context of pharmaceutical research and manufacturing?", "prev_section_summary": "This section focuses on defining key terms related to data integrity in GxP regulated activities, such as \"data\" and \"raw data.\" It outlines specific characteristics that data must possess to ensure its integrity, including being attributable, legible, contemporaneous, original, and accurate. The section also discusses the definition of raw data, emphasizing that it is the original record of information captured in a dynamic state, whether on paper or electronically. It addresses the implications for data considered as raw data when captured by basic electronic equipment that does not store electronic data permanently. Additionally, the section highlights the importance of data governance measures to ensure data completeness, consistency, endurance, and availability throughout the data lifecycle.", "excerpt_keywords": "metadata, data integrity, data governance, pharmaceutical industry, MHRA GXP"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf\n## 6.3. metadata\n\nmetadata are data that describe the attributes of other data and provide context and meaning. typically, these are data that describe the structure, data elements, inter-relationships and other characteristics of data e.g. audit trails. metadata also permit data to be attributable to an individual (or if automatically generated, to the original data source). metadata form an integral part of the original record. 
without the context provided by metadata the data has no meaning.\n\nexample (i): 3.5. the metadata, giving context and meaning (italic text in the original), are:\n\nsodium chloride batch 1234, 3.5mg. j smith 01/jul/14\n\nexample (ii): 3.5. the metadata, giving context and meaning (italic text in the original), are:\n\ntrial subject a123, sample ref x789 taken 30/06/14 at 1456hrs. 3.5mg. analyst: j smith 01/jul/14\n\n## 6.4. data integrity\n\ndata integrity is the degree to which data are complete, consistent, accurate, trustworthy, reliable and that these characteristics of the data are maintained throughout the data life cycle. the data should be collected and maintained in a secure manner, so that they are attributable, legible, contemporaneously recorded, original (or a true copy) and accurate. assuring data integrity requires appropriate quality and risk management systems, including adherence to sound scientific principles and good documentation practices.\n\n## 6.5. data governance\n\nthe arrangements to ensure that data, irrespective of the format in which they are generated, are recorded, processed, retained and used to ensure a complete, consistent and accurate record throughout the data lifecycle. data governance should address data ownership and accountability throughout the lifecycle, and consider the design, operation and monitoring of processes/systems to comply with the principles of data integrity including control over intentional and unintentional changes to data. data governance systems should include staff training in the importance of data integrity principles and the creation of a working environment that enables visibility, and actively encourages reporting of errors, omissions and undesirable results. 
senior management should be accountable for the implementation of systems and procedures to minimise the potential risk to data integrity, and for identifying the residual risk, using risk management techniques such as the principles of ich q9.\n\nmhra gxp data integrity guidance and definitions; revision 1: march 2018 page 9 of 21", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "da968890-4b13-4459-955a-2681f559ebb3": {"__data__": {"id_": "da968890-4b13-4459-955a-2681f559ebb3", "embedding": null, "metadata": {"page_label": "10", "file_name": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_type": "application/pdf", "file_size": 456031, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "\"Ensuring Data Integrity and Governance in Contractual Agreements and Data Lifecycle Management: A Comprehensive Guide\"", "questions_this_excerpt_can_answer": "1. What specific responsibilities do contract givers have regarding data ownership, governance, and accessibility when entering into agreements with third parties, according to the MHRA GXP Data Integrity Guidance and Definitions (2018)?\n\n2. How does the MHRA GXP Data Integrity Guidance and Definitions (2018) document define the data lifecycle, and what are the key stages involved in ensuring data integrity throughout this lifecycle?\n\n3. 
What guidelines does the MHRA GXP Data Integrity Guidance and Definitions (2018) provide for the recording and collection of data to ensure its accuracy, completeness, and appropriateness for intended use, especially in the context of electronic systems and dynamic storage?", "prev_section_summary": "The section discusses the importance of metadata in providing context and meaning to data, examples of metadata in the pharmaceutical industry, the definition and importance of data integrity, and the principles of data governance. Metadata describes the attributes of data, ensuring data integrity involves maintaining complete, consistent, and accurate data, and data governance involves recording, processing, and retaining data throughout its lifecycle while addressing ownership, accountability, and compliance with data integrity principles. Senior management is accountable for implementing systems to minimize risks to data integrity.", "excerpt_keywords": "data ownership, governance, accessibility, data lifecycle, data integrity, data transfer"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf\ncontract givers should ensure that data ownership, governance and accessibility are included in any contract/technical agreement with a third party. the contract giver should also perform a data governance review as part of their vendor assurance programme. data governance systems should also ensure that data are readily available and directly accessible on request from national competent authorities. electronic data should be available in human-readable form.\n\ndata lifecycle\n\nall phases in the life of the data from generation and recording through processing (including analysis, transformation or migration), use, data retention, archive/retrieval and destruction. 
data governance, as described in the previous section, must be applied across the whole data lifecycle to provide assurance of data integrity. data can be retained either in the original system, subject to suitable controls, or in an appropriate archive.\n\nrecording and collection of data\n\nno definition required. organizations should have an appropriate level of process understanding and technical knowledge of systems used for data collection and recording, including their capabilities, limitations and vulnerabilities. the selected method should ensure that data of appropriate accuracy, completeness, content and meaning are collected and retained for their intended use. where the capability of the electronic system permits dynamic storage, it is not appropriate for static (printed / manual) data to be retained in preference to dynamic (electronic) data. as data are required to allow the full reconstruction of activities the amount and the resolution (degree of detail) of data to be collected should be justified. when used, blank forms (including, but not limited to, worksheets, laboratory notebooks, and master production and control records) should be controlled. for example, numbered sets of blank forms may be issued and reconciled upon completion. similarly, bound paginated notebooks, stamped or formally issued by a document control group allow detection of unofficial notebooks and any gaps in notebook pages.\n\ndata transfer / migration\n\ndata transfer is the process of transferring data between different data storage types, formats, or computerized systems. data migration is the process of moving stored data from one durable storage location to another. this may include changing the format of data, but not the content or meaning. data transfer is the process of transferring data and metadata between storage media types or computerized systems. 
data migration where required may, if necessary, change the format of data to make it usable or visible on an alternative computerized system.\n\nmhra gxp data integrity guidance and definitions; revision 1: march 2018 page 10 of 21", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "5ef4be60-6d44-4f84-be26-b76fea0880a3": {"__data__": {"id_": "5ef4be60-6d44-4f84-be26-b76fea0880a3", "embedding": null, "metadata": {"page_label": "11", "file_name": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_type": "application/pdf", "file_size": 456031, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Data Management in GxP Compliance: Transfer, Migration, Processing, Exclusion, and Original Record Maintenance", "questions_this_excerpt_can_answer": "1. What specific considerations should be taken into account to ensure data integrity during the lifecycle of data transfer and migration according to the MHRA GXP Data Integrity Guidance and Definitions (2018)?\n\n2. How does the MHRA GXP Data Integrity Guidance and Definitions (2018) document suggest handling data processing activities to maintain traceability and ensure the integrity of processing parameters?\n\n3. According to the MHRA GXP Data Integrity Guidance and Definitions (2018), under what conditions can data be excluded from analysis, and what documentation is required to justify such exclusions?", "prev_section_summary": "The section discusses the responsibilities of contract givers in ensuring data ownership, governance, and accessibility in agreements with third parties. 
It also defines the data lifecycle, including stages such as data generation, processing, retention, and destruction. Guidelines for recording and collecting data to ensure accuracy, completeness, and appropriateness for use are provided, especially in electronic systems. The section also covers data transfer and migration processes. Key entities mentioned include contract givers, data governance systems, electronic data, data lifecycle, data recording and collection methods, data transfer, and data migration.", "excerpt_keywords": "Data Integrity, GxP Compliance, Data Transfer, Data Processing, Excluding Data"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf\n## data transfer/migration procedures\n\nprocedures should include a rationale and be robustly designed and validated to ensure that data integrity is maintained during the data lifecycle. careful consideration should be given to understanding the data format and the potential for alteration at each stage of data generation, transfer, and subsequent storage. the challenges of migrating data are often underestimated, particularly regarding maintaining the full meaning of the migrated records.\n\ndata transfer should be validated. the data should not be altered during or after it is transferred to the worksheet or other application. there should be an audit trail for this process. appropriate quality procedures should be followed if the data transfer during the operation has not occurred correctly. 
any changes in the middle layer software should be managed through appropriate quality management systems.\n\nelectronic worksheets used in automation like paper documentation should be version controlled, and any changes in the worksheet should be documented/verified appropriately.\n\n## data processing\n\na sequence of operations performed on data to extract, present, or obtain information in a defined format. examples might include: statistical analysis of individual patient data to present trends or conversion of a raw electronic signal to a chromatogram and subsequently a calculated numerical result.\n\nthere should be adequate traceability of any user-defined parameters used within data processing activities to the raw data, including attribution to who performed the activity. audit trails and retained records should allow reconstruction of all data processing activities regardless of whether the output of that processing is subsequently reported or otherwise used for regulatory or business purposes. if data processing has been repeated with progressive modification of processing parameters, this should be visible to ensure that the processing parameters are not being manipulated to achieve a more desirable result.\n\n## excluding data (not applicable to gpvp)\n\nnote: this is not applicable to gpvp; for gpvp refer to the pharmacovigilance legislation (including the gvp modules) which provide the necessary requirements and statutory guidance.\n\ndata may only be excluded where it can be demonstrated through valid scientific justification that the data are not representative of the quantity measured, sampled, or acquired. in all cases, this justification should be documented and considered during data review and reporting. all data (even if excluded) should be retained with the original data set and be available for review in a format that allows the validity of the decision to exclude the data to be confirmed.\n\n## original record and true copy\n\n6.11.1. 
original record\n\nmhra gxp data integrity guidance and definitions; revision 1: march 2018\n\npage 11 of 21", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "f1b2a0e4-a27a-4627-8cbb-3aa7ca126dcb": {"__data__": {"id_": "f1b2a0e4-a27a-4627-8cbb-3aa7ca126dcb", "embedding": null, "metadata": {"page_label": "12", "file_name": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_type": "application/pdf", "file_size": 456031, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Ensuring Data Integrity and True Copies in GxP Activities: Best Practices and Guidelines", "questions_this_excerpt_can_answer": "1. What distinguishes a static record format from a dynamic record format in the context of GxP activities, according to the MHRA GXP Data Integrity Guidance and Definitions (2018)?\n \n2. How does the MHRA GXP Data Integrity Guidance and Definitions (2018) document recommend handling situations where it is not practical or possible to retain the original copy of source data, such as MRI scans not under the study sponsor's control?\n\n3. What are the requirements for a copy to be considered a \"true copy\" according to the MHRA GXP Data Integrity Guidance and Definitions (2018), especially in terms of preserving the metadata and audit trail of the original record?", "prev_section_summary": "This section discusses data transfer and migration procedures, data processing activities, and the exclusion of data in the context of GxP compliance and data integrity. 
Key topics include the importance of maintaining data integrity throughout the data lifecycle, validation of data transfer, traceability of processing parameters, and the justification and documentation required for excluding data. The section also emphasizes the need for version control, documentation, and audit trails to ensure the integrity of data processing activities. Additionally, it highlights the requirements for retaining original records and true copies in compliance with MHRA GXP Data Integrity Guidance and Definitions.", "excerpt_keywords": "data integrity, GxP activities, true copy, metadata, audit trail"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf\n## the first or source capture of data or information\n\ne.g. original paper record of manual observation or electronic raw data file from a computerised system, and all subsequent data required to fully reconstruct the conduct of the gxp activity. original records can be static or dynamic.\n\n## a static record format\n\nsuch as a paper or electronic record, is one that is fixed and allows little or no interaction between the user and the record content. for example, once printed or converted to static electronic format chromatography records lose the capability of being reprocessed or enabling more detailed viewing of baselines.\n\n## records in dynamic format\n\nsuch as electronic records, allow an interactive relationship between the user and the record content. for example, electronic records in database formats allow the user to track, trend and query data; chromatography records maintained as electronic records allow the user or reviewer (with appropriate access permissions) to reprocess the data and expand the baseline to view the integration more clearly.\n\nwhere it is not practical or feasibly possible to retain the original copy of source data, (e.g. 
mri scans, where the source machine is not under the study sponsors control and the operator can only provide summary statistics) the risks and mitigation should be documented.\n\nwhere the data obtained requires manual observation to record (for example results of a manual titration, visual interpretation of environmental monitoring plates) the process should be risk assessed and depending on the criticality, justify if a second contemporaneous verification check is required or investigate if the result could be captured by an alternate means.\n\n## true copy\n\na copy (irrespective of the type of media used) of the original record that has been verified (i.e. by a dated signature or by generation through a validated process) to have the same information, including data that describe the context, content, and structure, as the original.\n\na true copy may be stored in a different electronic file format to the original record if required, but must retain the metadata and audit trail required to ensure that the full meaning of the data are kept and its history may be reconstructed.\n\noriginal records and true copies must preserve the integrity of the record. true copies of original records may be retained in place of the original record (e.g. scan of a paper record), if a documented system is in place to verify and record the integrity of the copy. organizations should consider any risk associated with the destruction of original records.\n\nit should be possible to create a true copy of electronic data, including relevant metadata, for the purposes of review, backup and archival. accurate and complete copies for certification of the copy should include the meaning of the data (e.g. date formats, context, layout, electronic signatures and authorizations) and the full gxp audit trail. 
consideration should be given to the dynamic functionality of a true copy throughout the retention period (see archive).\n\ndata must be retained in a dynamic form where this is critical to its integrity or later verification. if the computerized system cannot be maintained e.g., if it is no longer supported, then records should be archived according to a documented archiving strategy prior to mhra gxp data integrity guidance and definitions; revision 1: march 2018 page 12 of 21", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "a4377e1d-fe2f-4be9-ae2f-e4726d3bbf2d": {"__data__": {"id_": "a4377e1d-fe2f-4be9-ae2f-e4726d3bbf2d", "embedding": null, "metadata": {"page_label": "13", "file_name": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_type": "application/pdf", "file_size": 456031, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Ensuring Data Integrity and Audit Trail Compliance in Computerised Systems for Regulating Medicines and Medical Devices", "questions_this_excerpt_can_answer": "1. What specific elements must be included in the data retention process to ensure compliance with MHRA GXP Data Integrity Guidance when decommissioning a computerised system used for regulating medicines and medical devices?\n\n2. According to the MHRA GXP Data Integrity Guidance, how should a computerised system transaction be defined, and what are the requirements for saving such transactions to ensure they are captured in the system's audit trail?\n\n3. 
What does the MHRA GXP Data Integrity Guidance specify about the creation and maintenance of audit trails in computerised systems used for the regulation of medicines and medical devices, particularly regarding the information that must be recorded to facilitate the reconstruction of the history of GXP records?", "prev_section_summary": "The section discusses the concepts of static and dynamic record formats in the context of GxP activities, emphasizing the importance of retaining original source data and true copies. It outlines the differences between static records (fixed and limited interaction) and dynamic records (allowing user interaction), and provides guidance on handling situations where retaining the original copy of source data is not practical. The document defines a true copy as a verified copy of the original record that maintains metadata and audit trail integrity. It also highlights the requirements for preserving data integrity, including considerations for electronic data, metadata, and audit trails throughout the retention period. The section stresses the importance of maintaining the dynamic functionality of true copies and archiving records according to a documented strategy to ensure data integrity.", "excerpt_keywords": "MHRA, GXP, Data Integrity, Audit Trail, Computerised Systems"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf\n## mhra\n\nregulating medicines and medical devices\n\ndecommissioning the computerised system. it is conceivable for some data generated by electronic means to be retained in an acceptable paper or electronic format, where it can be justified that a static record maintains the integrity of the original data. 
however, the data retention process must be shown to include verified copies of all raw data, metadata, relevant audit trail and result files, any variable software/system configuration settings specific to each record, and all data processing runs (including methods and audit trails) necessary for reconstruction of a given raw data set. it would also require a documented means to verify that the printed records were an accurate representation. to enable a gxp compliant record, this approach is likely to be demanding in its administration.\n\nwhere manual transcriptions occur, these should be verified by a second person or validated system.\n\n### 6.12. computerised system transactions:\n\na computerised system transaction is a single operation or sequence of operations performed as a single logical unit of work. the operation(s) that make a transaction may not be saved as a permanent record on durable storage until the user commits the transaction through a deliberate act (e.g. pressing a save button), or until the system forces the saving of data. the metadata (e.g. username, date, and time) are not captured in the system audit trail until the user saves the transaction to durable storage. in computerised systems, an electronic signature may be required for the record to be saved and become permanent.\n\na critical step is a parameter that must be within an appropriate limit, range, or distribution to ensure the safety of the subject or quality of the product or data. computer systems should be designed to ensure that the execution of critical steps is recorded contemporaneously. where transactional systems are used, the combination of multiple unit operations into a combined single transaction should be avoided, and the time intervals before saving of data should be minimized. systems should be designed to require saving data to permanent memory before prompting users to make changes.\n\nthe organization should define during the development of the system (e.g. 
via the user requirements specification) what critical steps are appropriate based on the functionality of the system and the level of risk associated. critical steps should be documented with process controls that consider system design (prevention), together with monitoring and review processes. oversight of activities should alert to failures that are not addressed by the process design.\n\n### 6.13. audit trail\n\nthe audit trail is a form of metadata containing information associated with actions that relate to the creation, modification or deletion of gxp records. an audit trail provides for secure recording of life-cycle details such as creation, additions, deletions or alterations of information in a record, either paper or electronic, without obscuring or overwriting the original record. an audit trail facilitates the reconstruction of the history of such events relating to the record regardless of its medium, including the \"who, what, when and why\" of the action.\n\nwhere computerised systems are used to capture, process, report, store or archive raw data electronically, system design should always provide for the retention of audit trails to show all\n\nmhra gxp data integrity guidance and definitions; revision 1: march 2018 page 13 of 21", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "cc4abb48-14bd-4b06-8a5b-681e65453287": {"__data__": {"id_": "cc4abb48-14bd-4b06-8a5b-681e65453287", "embedding": null, "metadata": {"page_label": "14", "file_name": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_type": "application/pdf", "file_size": 456031, "creation_date": "2024-04-07", 
"last_modified_date": "2024-03-28", "document_title": "Ensuring Data Integrity and Electronic Signatures in the Regulation of Medicines and Medical Devices", "questions_this_excerpt_can_answer": "1. What specific measures does the MHRA GXP Data Integrity Guidance and Definitions document recommend for ensuring that changes to data in the regulation of medicines and medical devices are traceable and accountable?\n \n2. How does the MHRA document address the issue of audit trails in systems that lack the necessary functionality, particularly in the context of legacy systems within the pharmaceutical and medical device industries?\n\n3. What are the MHRA's expectations regarding the use of electronic signatures in GMP facilities, especially in relation to industrial automation and control equipment/systems, as outlined in the 2018 guidance document?", "prev_section_summary": "This section discusses the importance of data retention processes when decommissioning a computerized system used for regulating medicines and medical devices. It emphasizes the need for verified copies of raw data, metadata, audit trails, and result files to ensure data integrity. The section also defines computerized system transactions and outlines requirements for saving transactions in the system's audit trail. Additionally, it covers the creation and maintenance of audit trails in computerized systems, highlighting the information that must be recorded for the reconstruction of GXP records. Key topics include data retention, computerized system transactions, critical steps, audit trails, and system design considerations for ensuring data integrity and audit trail compliance. 
Key entities mentioned are MHRA (Medicines and Healthcare products Regulatory Agency) and GXP (Good Practices).", "excerpt_keywords": "MHRA, GXP, Data Integrity, Audit Trails, Electronic Signatures"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf\n## mhra\n\nregulating medicines and medical devices\n\nchanges to, or deletion of data while retaining previous and original data. it should be possible to associate all data and changes to data with the persons making those changes, and changes should be dated and time stamped (time and time zone where applicable). the reason for any change should also be recorded. the items included in the audit trail should be those of relevance to permit reconstruction of the process or activity.\n\naudit trails (identified by risk assessment as required) should be switched on. users should not be able to amend or switch off the audit trail. where a system administrator amends or switches off the audit trail, a record of that action should be retained.\n\nthe relevance of data retained in audit trails should be considered by the organisation to permit robust data review/verification. it is not necessary for audit trail review to include every system activity (e.g. user log on/off, keystrokes etc.).\n\nwhere relevant audit trail functionality does not exist (e.g. within legacy systems), an alternative control may be achieved, for example, by defining the process in an sop and the use of log books. alternative controls should be proven to be effective.\n\nwhere add-on software or a compliant system does not currently exist, continued use of the legacy system may be justified by documented evidence that a compliant solution is being sought and that mitigation measures temporarily support the continued use.\n\nroutine data review should include a documented audit trail review where this is determined by a risk assessment.
when designing a system for review of audit trails, this may be limited to those with gxp relevance. audit trails may be reviewed as a list of relevant data, or by an exception reporting process. an exception report is a validated search tool that identifies and documents predetermined abnormal data or actions that require further attention or investigation by the data reviewer.\n\nreviewers should have sufficient knowledge and system access to review relevant audit trails, raw data and metadata (see also data governance).\n\nwhere systems do not meet the audit trail and individual user account expectations, demonstrated progress should be available to address these shortcomings. this should either be through add-on software that provides these additional functions or by an upgrade to a compliant system. where remediation has not been identified or subsequently implemented in a timely manner, a deficiency may be cited.\n\n## 6.14. electronic signatures\n\na signature in digital form (biometric or non-biometric) that represents the signatory.
this should be equivalent in legal terms to the handwritten signature of the signatory.\n\nthe use of electronic signatures should be appropriately controlled with consideration given to:\n\n- how the signature is attributable to an individual.\n\nit is expected that gmp facilities with industrial automation and control equipment/systems such as programmable logic controllers should be able to demonstrate working towards system upgrades with individual login and audit trails (reference: art 23 of directive 2001/83/ec).\n\nmhra gxp data integrity guidance and definitions; revision 1: march 2018\n\npage 14 of 21", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "d6da7767-6165-415f-a3ea-82df581f8b3a": {"__data__": {"id_": "d6da7767-6165-415f-a3ea-82df581f8b3a", "embedding": null, "metadata": {"page_label": "15", "file_name": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_type": "application/pdf", "file_size": 456031, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Electronic Signature and Data Review in MHRA Regulations: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. How does the MHRA GXP Data Integrity Guidance and Definitions (2018) document specify the recording and security measures for electronic signatures to ensure they cannot be altered or manipulated without invalidating the signature or the status of the entry?\n\n2. 
What are the MHRA's expectations regarding the maintenance of metadata associated with electronic signatures, especially when a paper or PDF copy of an electronically signed document is produced, as outlined in the 2018 guidance document?\n\n3. According to the MHRA GXP Data Integrity Guidance and Definitions (2018), what procedures and principles should be followed for the review and approval of data, including the handling of errors or omissions identified during the data review process?", "prev_section_summary": "The section discusses the importance of data integrity in the regulation of medicines and medical devices, specifically focusing on traceability and accountability of data changes. It emphasizes the need for audit trails to track data changes, with specific requirements for retaining original data, associating changes with individuals, and recording reasons for changes. The document also addresses the issue of audit trails in systems lacking necessary functionality, suggesting alternative controls such as defining processes in SOPs or using log books. Additionally, the section touches on electronic signatures, highlighting the need for them to be equivalent to handwritten signatures and appropriately controlled in GMP facilities with industrial automation and control equipment/systems. 
The MHRA GXP Data Integrity Guidance and Definitions document from March 2018 is referenced throughout the section.", "excerpt_keywords": "MHRA, Data Integrity, Electronic Signatures, Audit Trails, Metadata"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf\n## mhra\n\nregulating medicines and medical devices\n\n* how the act of signing is recorded within the system so that it cannot be altered or manipulated without invalidating the signature or status of the entry.\n\n* how the record of the signature will be associated with the entry made and how this can be verified.\n\n* the security of the electronic signature i.e. so that it can only be applied by the owner of that signature.\n\nit is expected that appropriate validation of the signature process associated with a system is undertaken to demonstrate suitability and that control over signed records is maintained.\n\nwhere a paper or pdf copy of an electronically signed document is produced, the metadata associated with an electronic signature should be maintained with the associated document.\n\nthe use of electronic signatures should be compliant with the requirements of international standards. the use of advanced electronic signatures should be considered where this method of authentication is required by the risk assessment. electronic signature or e-signature systems must provide for \"signature manifestations\" i.e. a display within the viewable record that defines who signed it, their title, and the date (and time, if significant) and the meaning of the signature (e.g. verified or approved).\n\nan inserted image of a signature or a footnote indicating that the document has been electronically signed (where this has been entered by a means other than the validated electronic signature process) is not adequate. 
where a document is electronically signed then the metadata associated with the signature should be retained.\n\nfor printed copies of electronically signed documents refer to true copy section.\n\nexpectations for electronic signatures associated with informed consent (gcp) are covered in alternative guidance (mhra/hra draft guidance on the use of electronic consent).\n\n## 6.15. data review and approval\n\nthe approach to reviewing specific record content, such as critical data and metadata, cross-outs (paper records) and audit trails (electronic records) should meet all applicable regulatory requirements and be risk-based.\n\nthere should be a procedure that describes the process for review and approval of data. data review should also include a risk-based review of relevant metadata, including relevant audit trails records. data review should be documented and the record should include a positive statement regarding whether issues were found or not, the date that review was performed and the signature of the reviewer.\n\na procedure should describe the actions to be taken if data review identifies an error or omission. 
this procedure should enable data corrections or clarifications to provide visibility of the original record, and traceability of the correction, using alcoa principles (see data definition).\n\nwhere data review is not conducted by the organisation that generated the data, the responsibilities for data review must be documented and agreed by both parties.\n\nmhra gxp data integrity guidance and definitions; revision 1: march 2018 page 15 of 21", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "7b8554d4-6f72-4e3d-a971-42029930f6ac": {"__data__": {"id_": "7b8554d4-6f72-4e3d-a971-42029930f6ac", "embedding": null, "metadata": {"page_label": "16", "file_name": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_type": "application/pdf", "file_size": 456031, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Ensuring Data Integrity in Computerised Systems for Regulating Medicines and Medical Devices: Best Practices and Guidelines", "questions_this_excerpt_can_answer": "1. What specific measures does the MHRA GXP Data Integrity Guidance recommend for ensuring data integrity when reports are shared between organizations, such as contract givers and acceptors?\n \n2. According to the MHRA GXP Data Integrity Guidance, how should organizations handle user access and system administrator roles in computerized systems to maintain data integrity, especially in relation to GXP data?\n\n3. What are the MHRA's recommendations for dealing with computerized systems that support only a single user login or a limited number of user logins, particularly when no suitable alternative system is available for ensuring data integrity?", "prev_section_summary": "The section discusses the requirements and expectations outlined by the MHRA GXP Data Integrity Guidance and Definitions (2018) regarding electronic signatures and data review in regulated environments. Key topics include recording and security measures for electronic signatures, maintenance of metadata, validation of signature processes, use of advanced electronic signatures, and procedures for data review and approval. Entities mentioned include the MHRA, electronic signature systems, metadata, audit trails, and the responsibilities for data review.", "excerpt_keywords": "MHRA, GXP, Data Integrity, Computerized Systems, Electronic Signatures"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf\n## mhra regulating medicines and medical devices\n\nsummary reports of data are often supplied between organisations (contract givers and acceptors). it must be acknowledged that summary reports are limited and critical supporting data and metadata may not be included.\n\nmany software packages allow configuration of customised reports. key actions may be incorporated into such reports provided they are validated and locked to prevent changes. automated reporting tools and reports may reduce the checks required to assure the integrity of the data.\n\nwhere summary reports are supplied by a different organisation, the organisation receiving and using the data should evaluate the data provider's data integrity controls and processes prior to using the information.\n\n- routine data review should consider the integrity of an individual data set e.g. is this the only data generated as part of this activity?
has the data been generated and maintained correctly? are there indicators of unauthorised changes?\n- periodic audit of the data generated (encompassing both a review of electronically generated data and the broader organisational review) might verify the effectiveness of existing control measures and consider the possibility of unauthorised activity at all interfaces, e.g. have there been it requests to amend any data post review? have there been any system maintenance activities and has the impact of that activity been assessed?\n\n## 6.16. computerised system user access/system administrator roles\n\nfull use should be made of access controls to ensure that people have access only to functionality that is appropriate for their job role, and that actions are attributable to a specific individual. companies must be able to demonstrate the access levels granted to individual staff members and ensure that historical information regarding user access level is available.\n\nwhere the system does not capture this data, then a record must be maintained outside of the system. access controls should be applied to both the operating system and application levels. individual login at operating system level may not be required if appropriate controls are in place to ensure data integrity (e.g. no modification, deletion or creation of data outside the application is possible).\n\nfor systems generating, amending or storing gxp data, shared logins or generic user access should not be used. where the computerised system design supports individual user access, this function must be used. this may require the purchase of additional licences.
systems (such as mrp systems) that are not used in their entirety for gxp purposes, but do have elements within them that are gxp applicable (such as approved suppliers, stock status, location and transaction histories), require appropriate assessment and control.\n\nit is acknowledged that some computerised systems support only a single user login or limited numbers of user logins. where no suitable alternative computerised system is available, equivalent control may be provided by third-party software or a paper-based method of providing traceability (with version control). the suitability of alternative systems should be justified and documented. increased data review is likely to be required for hybrid systems.\n\nmhra gxp data integrity guidance and definitions; revision 1: march 2018\n\npage 16 of 21", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "7d85cbf6-d778-4905-9080-8183ecc08727": {"__data__": {"id_": "7d85cbf6-d778-4905-9080-8183ecc08727", "embedding": null, "metadata": {"page_label": "17", "file_name": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_type": "application/pdf", "file_size": 456031, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Ensuring Data Integrity and Access Control in Regulated Environments: Best Practices and Guidelines", "questions_this_excerpt_can_answer": "1. What specific measures does the MHRA GXP Data Integrity Guidance and Definitions (2018) recommend for restricting system administrator access within regulated environments to ensure data integrity?\n \n2.
How does the MHRA GXP Data Integrity Guidance and Definitions (2018) document suggest handling data retention to protect records from alteration or loss, and what validation processes are recommended for ensuring the integrity of retained data?\n\n3. According to the MHRA GXP Data Integrity Guidance and Definitions (2018), what considerations should be made when assigning system administrator rights to prevent conflicts of interest and unauthorized data changes, especially in the context of clinical trial data management?", "prev_section_summary": "The key topics of this section include ensuring data integrity in computerized systems for regulating medicines and medical devices, sharing reports between organizations, user access and system administrator roles in computerized systems, and dealing with computerized systems that support only a single user login or limited user logins. The section emphasizes the importance of validating and locking custom reports, evaluating data providers' data integrity controls, implementing access controls to ensure appropriate access levels, and justifying the use of alternative systems when necessary. The MHRA GXP Data Integrity Guidance provides recommendations for maintaining data integrity in various aspects of computerized systems.", "excerpt_keywords": "MHRA, data integrity, system administrator access, data retention, audit trail"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf\n## mhra\n\nregulating medicines and medical devices\n\nbecause they are vulnerable to non-attributable data changes. it is expected that companies should be implementing systems that comply with current regulatory expectations².\n\nsystem administrator access should be restricted to the minimum number of people possible, taking account of the size and nature of the organisation. the generic system administrator account should not be available for routine use. personnel with system administrator access should log in with unique credentials that allow actions in the audit trail(s) to be attributed to a specific individual. the intent of this is to prevent giving access to users with a potential conflict of interest who could make unauthorised changes that would not be traceable to that person.\n\nsystem administrator rights (permitting activities such as data deletion, database amendment or system configuration changes) should not be assigned to individuals with a direct interest in the data (data generation, data review or approval).\n\nindividuals may require changes in their access rights depending on the status of clinical trial data. for example, once data management processes are complete, the data is locked by removing editing access rights. this should be able to be demonstrated within the system.\n\n## 6.17. data retention\n\ndata retention may be for archiving (protected data for long-term storage) or backup (data for the purposes of disaster recovery).\n\ndata and document retention arrangements should ensure the protection of records from deliberate or inadvertent alteration or loss.
secure controls must be in place to ensure the data integrity of the record throughout the retention period and should be validated where appropriate (see also data transfer/migration).\n\ndata (or a true copy) generated in paper format may be retained by using a validated scanning process provided there is a documented process in place to ensure that the outcome is a true copy.\n\nprocedures for destruction of data should consider data criticality and where applicable legislative retention requirements.\n\n2 it is expected that gmp facilities with industrial automation and control equipment/systems such as programmable logic controllers should be able to demonstrate working towards system upgrades with individual login and audit trails (reference: art 23 of directive 2001/83/ec).\n\nmhra gxp data integrity guidance and definitions; revision 1: march 2018\n\npage 17 of 21", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "0fe3dabe-68dd-4e4a-92d9-555a6ce7b73d": {"__data__": {"id_": "0fe3dabe-68dd-4e4a-92d9-555a6ce7b73d", "embedding": null, "metadata": {"page_label": "18", "file_name": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_type": "application/pdf", "file_size": 456031, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Data Retention, Verification, and Recovery: Archive and Backup Processes Guide", "questions_this_excerpt_can_answer": "1. 
What specific measures are recommended by the MHRA GXP Data Integrity Guidance for ensuring the long-term protection and recovery of archived data and metadata, especially in the context of electronic data and legacy systems?\n\n2. How does the MHRA GXP Data Integrity Guidance differentiate between the purposes and validation requirements of data archiving and backup processes, particularly in relation to disaster recovery and the verification of processes or activities?\n\n3. What considerations does the MHRA GXP Data Integrity Guidance suggest for maintaining accessibility to data stored in legacy systems, including the potential need for data migration and the balance of risk between long-term accessibility and the preservation of dynamic data functionality?", "prev_section_summary": "The section discusses the importance of restricting system administrator access in regulated environments to prevent unauthorized data changes, recommending unique credentials for system administrators to ensure traceability of actions. It also emphasizes the need to avoid conflicts of interest by not assigning system administrator rights to individuals directly involved in data generation or review. Additionally, the section addresses data retention practices for protecting records from alteration or loss, highlighting the importance of secure controls and validation processes throughout the retention period. The document also mentions the validation of scanning processes for retaining paper-based data and considerations for data destruction based on criticality and legislative requirements. 
Overall, the section focuses on ensuring data integrity and access control in regulated environments, as outlined in the MHRA GXP Data Integrity Guidance and Definitions (2018).", "excerpt_keywords": "Data Integrity, MHRA, GXP, Archive, Backup"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf\n## 6.17.1. archive\n\na designated secure area or facility (e.g. cabinet, room, building or computerised system) for the long-term retention of data and metadata for the purposes of verification of the process or activity. archived records may be the original record or a true copy and should be protected so they cannot be altered or deleted without detection and protected against any accidental damage such as fire or pest. archive arrangements must be designed to permit recovery and readability of the data and metadata throughout the required retention period. in the case of archiving of electronic data, this process should be validated, and in the case of legacy systems the ability to review data periodically verified (i.e. to confirm the continued support of legacy computerised systems). where hybrid records are stored, references between physical and electronic records must be maintained such that full verification of events is possible throughout the retention period. when legacy systems can no longer be supported, consideration should be given to maintaining the software for data accessibility purposes (for as long as possible, depending upon the specific retention requirements). this may be achieved by maintaining software in a virtual environment. migration to an alternative file format that retains as much as possible of the true copy attributes of the data may be necessary with increasing age of the legacy data.
where migration with full original data functionality is not technically possible, options should be assessed based on risk and the importance of the data over time. the migration file format should be selected considering the balance of risk between long-term accessibility versus the possibility of reduced dynamic data functionality (e.g. data interrogation, trending, re-processing etc.). it is recognised that the need to maintain accessibility may require migration to a file format that loses some attributes and/or dynamic data functionality (see also data migration).\n\n## 6.17.2. backup\n\na copy of current (editable) data, metadata and system configuration settings maintained for recovery including disaster recovery. backup and recovery processes should be validated and periodically tested. each backup should be verified to ensure that it has functioned correctly, e.g. by confirming that the data size transferred matches that of the original record. the backup strategies for the data owners should be documented.
backups for recovery purposes do not replace the need for the long-term retention of data and metadata in its final form for the purposes of verification of the process or activity.\n\nmhra gxp data integrity guidance and definitions; revision 1: march 2018 page 18 of 21", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "78d0b9da-712f-4fe9-95b4-b9217cc64349": {"__data__": {"id_": "78d0b9da-712f-4fe9-95b4-b9217cc64349", "embedding": null, "metadata": {"page_label": "19", "file_name": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_type": "application/pdf", "file_size": 456031, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Ensuring Data Integrity and Compliance in Regulated Environments", "questions_this_excerpt_can_answer": "1. What are the primary types of file structures mentioned in the MHRA GXP Data Integrity Guidance and Definitions (2018) document, and how do their attributes influence the need for different controls and data review methods?\n\n2. According to the MHRA GXP Data Integrity Guidance and Definitions (2018), why is vendor-supplied validation data considered insufficient in isolation for ensuring the performance qualification of computerized systems in regulated environments?\n\n3.
How does the MHRA GXP Data Integrity Guidance and Definitions (2018) document recommend managing data integrity, ownership, and security concerns when utilizing cloud or virtual services, including the considerations for contracts with service providers?", "prev_section_summary": "This section discusses the concepts of archive and backup processes in the context of data retention, verification, and recovery. Key topics include the definition of archive as a secure area for long-term retention of data and metadata, the validation of archiving processes for electronic data and legacy systems, the maintenance of accessibility to data stored in legacy systems through potential data migration, and the selection of file formats for data migration based on the balance of risk between long-term accessibility and dynamic data functionality. The section also defines backup as a copy of current data, metadata, and system settings maintained for recovery purposes, emphasizing the need for validation and periodic testing of backup processes. The importance of documenting backup strategies and verifying the correctness of backups is highlighted, with the reminder that backups do not replace the long-term retention of data and metadata for verification purposes.", "excerpt_keywords": "file structure, validation, computerized systems, cloud providers, data integrity"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf\n## mhra regulating medicines and medical devices\n\n## 6.18. file structure\n\ndata integrity risk assessment requires a clear understanding of file structure. the way data is structured within the gxp environment will depend on what the data will be used for and the end user may have this dictated to them by the software/computerized systems available. there are many types of file structure, the most common being flat files and relational databases.
different file structures, due to their attributes, may require different controls and data review methods and may retain metadata in different ways.\n\n## 6.19. validation - for intended purpose (gmp; see also annex 11, 15)\n\ncomputerized systems should comply with regulatory requirements and associated guidance. these should be validated for their intended purpose, which requires an understanding of the computerized system's function within a process. for this reason, the acceptance of vendor-supplied validation data in isolation from system configuration and the user's intended use is not acceptable. in isolation from the intended process or end-user it infrastructure, vendor testing is likely to be limited to functional verification only and may not fulfill the requirements for performance qualification. functional verification demonstrates that the required information is consistently and completely presented. validation for intended purpose ensures that the steps for generating the custom report accurately reflect those described in the data checking sop and that the report output is consistent with the procedural steps for performing the subsequent review.\n\n## 6.20. it suppliers and service providers (including cloud providers and virtual service/platforms)\n\nwhere cloud or virtual services are used, attention should be paid to understanding the service provided, ownership, retrieval, retention, and security of data. the physical location where the data is held, including the impact of any laws applicable to that geographic location, should be considered. the responsibilities of the contract giver and acceptor should be defined in a technical agreement or contract. this should ensure timely access to data (including metadata and audit trails) for the data owner and national competent authorities upon request. contracts with providers should define responsibilities for archiving and continued readability of the data throughout the retention period (see archive). 
appropriate arrangements must exist for the restoration of the software/system as per its original validated state, including validation and change control information to permit this restoration. business continuity arrangements should be included in the contract and tested. the need for an audit of the service provider should be based upon risk.\n\nmhra gxp data integrity guidance and definitions; revision 1: march 2018 page 19 of 21", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "3f6410a2-84e5-4355-8686-16804d3a0830": {"__data__": {"id_": "3f6410a2-84e5-4355-8686-16804d3a0830", "embedding": null, "metadata": {"page_label": "20", "file_name": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf", "file_type": "application/pdf", "file_size": 456031, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Data Management and Quality Assurance in Clinical Research: Best Practices and Strategies", "questions_this_excerpt_can_answer": "1. What specific practices does the acronym \"GXP\" encompass within the context of data integrity and quality assurance in clinical research, as defined by the MHRA GXP Data Integrity Guidance and Definitions (2018)?\n\n2. How does the document \"Data Management and Quality Assurance in Clinical Research: Best Practices and Strategies\" define the enhanced version of ALCOA, known as ALCOA+, in terms of ensuring data integrity in clinical research?\n\n3. 
What are the key components and requirements for an advanced electronic signature as outlined in the MHRA GXP Data Integrity Guidance and Definitions (2018), particularly in the context of validating the identity of the signer and the integrity of the data in clinical research documentation?", "prev_section_summary": "The section discusses the importance of understanding file structure in regulated environments, the validation of computerized systems for their intended purpose, and considerations for managing data integrity, ownership, and security when using cloud or virtual services. Key entities mentioned include different types of file structures (flat files and relational databases), the need for validation beyond vendor-supplied data, and the responsibilities and considerations for IT suppliers and service providers, including contracts and data access.", "excerpt_keywords": "Data Integrity, GXP, ALCOA+, Electronic Signature, Data Quality"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf\n|acronym or word or phrase|definition|\n|---|---|\n|ecrf|electronic case report form|\n|ecg|electrocardiogram|\n|gxp|good x practice where x is used as a collective term for gdp - good distribution practice, gcp - good clinical practice, glp - good laboratory practice, gmp - good manufacturing practice, gpvp - good pharmacovigilance practice|\n|data quality|the assurance that data produced is exactly what was intended to be produced and fit for its intended purpose. 
this incorporates alcoa|\n|alcoa|acronym referring to attributable, legible, contemporaneous, original and accurate|\n|alcoa +|acronym referring to attributable, legible, contemporaneous, original and accurate plus complete, consistent, enduring, and available|\n|dira|data integrity risk assessment|\n|terminology|the body of terms used with a particular technical application in a subject of study, profession, etc.|\n|data cleaning|the process of detecting and correcting (or removing) corrupt or inaccurate records from a record set, table, or database; refers to identifying incomplete, incorrect, inaccurate or irrelevant parts of the data and then replacing, modifying, or deleting the dirty or coarse data|\n|format|the way something is arranged or set out|\n|directly accessible|at once; without delay|\n|procedures|written instructions or other documentation describing a process, i.e. standard operating procedures (sop)|\n|advanced electronic signatures|an electronic signature based upon cryptographic methods of originator authentication, computed by using a set of rules and a set of parameters such that the identity of the signer and the integrity of the data can be verified|\n|validated scanning process|a process whereby documents / items are scanned with added controls, such as location identifiers and ocr, so that each page duplicated does not have to be further checked by a human|
Integrity Guidance and Definitions (2018).pdf", "file_type": "application/pdf", "file_size": 456031, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Regulatory Guidelines for Good Practices in the European Union and MHRA GXP Data Integrity Guidance: Ensuring Compliance and Integrity in Pharmaceutical Manufacturing", "questions_this_excerpt_can_answer": "1. What are the key references cited in the MHRA GXP Data Integrity Guidance and Definitions document for ensuring compliance and integrity in pharmaceutical manufacturing as of 2018?\n\n2. What specific guidance documents and regulatory frameworks does the MHRA GXP Data Integrity Guidance and Definitions document reference for good manufacturing practice (GMP), good laboratory practice (GLP), and good clinical practice (GCP) within the European Union as of 2018?\n\n3. What is the publication month and the reason for the first revision of the MHRA GXP Data Integrity Guidance and Definitions document, and how does it contribute to understanding the evolution of data integrity guidelines in the pharmaceutical sector as of March 2018?", "prev_section_summary": "The section discusses the key topics of data integrity and quality assurance in clinical research, focusing on practices defined by the MHRA GXP Data Integrity Guidance and Definitions (2018). It covers the definition of the acronym \"GXP\" encompassing various good practices, the enhanced version of ALCOA known as ALCOA+, key components of an advanced electronic signature, data cleaning processes, terminology, and procedures for ensuring data quality and integrity. 
The section also highlights the importance of validated scanning processes and directly accessible data in maintaining data integrity in clinical research documentation.", "excerpt_keywords": "MHRA, GXP, Data Integrity, Pharmaceutical Manufacturing, Compliance"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[11] MHRA GXP Data Integrity Guidance and Definitions (2018).pdf\n## references\n\n|computerised systems. in: the rules governing medicinal products in the european union. volume 4: good manufacturing practice (gmp) guidelines: annex 11. brussels: european commission.|http://ec.europa.eu/enterprise/pharmaceuticals/eudralex/vol-4/pdfs-en/anx11en.pdf|\n|---|---|\n|oecd series on principles of good laboratory practice (glp) and compliance monitoring. paris: organisation for economic co-operation and development.|http://www.oecd.org/chemicalsafety/testing/oecdseriesonprinciplesofgoodlaboratorypracticeglpandcompliancemonitoring.htm|\n|good clinical practice (gcp) ich e6(r2) november 2016|http://www.ich.org/products/guidelines/efficacy/article/efficacy-guidelines.html|\n|guidance on good data and record management practices; world health organisation, who technical report series, no.996, annex 5; 2016.|http://apps.who.int/medicinedocs/en/m/abstract/js22402en/|\n|good practices for data management and integrity in regulated gmp/gdp environments - pic/s; pi041-1(draft 2); august 2016.|https://picscheme.org/en/news?itemid=33|\n|mhra gmp data integrity definitions and guidance for industry. 
london: medicines and healthcare products regulatory agency; march 2015.|https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/412735/data_integrity_definitions_and_guidance_v2.pdf|\n|mhra/hra draft guidance on the use of electronic consent|http://www.hra-decisiontools.org.uk/consent/|\n|eu pharmacovigilance legislation: http://ec.europa.eu/health/human-use/pharmacovigilance| |\n|the human medicines regulations 2012 (statutory instrument 2012 no. 1916): http://www.legislation.gov.uk/uksi/2012/1916/contents/made| |\n|eu good pharmacovigilance practice modules: http://www.ema.europa.eu/ema/index.jsp?curl=pages/regulation/document_listing/document_listing_000345.jsp&mid=wc0b01ac058058f32c| |\n\n## revision history\n\n|revision|publication month|reason for changes|\n|---|---|---|\n|revision 1|march 2018|none. first issue.|\n\n## mhra gxp data integrity guidance and definitions; revision 1: march 2018\n\npage 21 of 21", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "9f5f47bf-6582-41e6-9d64-ee03f81dfc51": {"__data__": {"id_": "9f5f47bf-6582-41e6-9d64-ee03f81dfc51", "embedding": null, "metadata": {"page_label": "1", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Pharmaceutical Inspection Cooperation and Data Management Guidelines", "questions_this_excerpt_can_answer": "1. 
What is the official title of the PIC/S guidance document that outlines good practices for data management and integrity in regulated GMP/GDP environments, as released on 1 July 2021?\n \n2. How does the PIC/S document PI 041-1, published in 2021, regulate the reproduction of its content for commercial and internal use, and what are the conditions for authorized internal reproduction?\n\n3. Who is the editor or the responsible entity for the PIC/S guidance document PI 041-1 on good practices for data management and integrity, and how can they be contacted for further information or inquiries?", "excerpt_keywords": "PIC/S, data management, integrity, regulated environments, GMP/GDP"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n## pic/s\n\npharmaceutical inspection convention\n\npharmaceutical inspection co-operation scheme\n\npi041-1\n\n1 july 2021\n\n## pic/s guidance\n\ngood practices for data management and integrity in regulated gmp/gdp environments\n\n(c) pic/s 2021\n\nreproduction prohibited for commercial purposes.\n\nreproduction for internal use is authorised, provided that the source is acknowledged.\n\neditor: pic/s secretariat\n\ne-mail: info@picscheme.org\n\nweb site: https://www.picscheme.org\n\npi 041-1 1 of 63 1 july 2021", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "920df918-9815-4156-af26-be6bb4a58af1": {"__data__": {"id_": "920df918-9815-4156-af26-be6bb4a58af1", "embedding": null, "metadata": {"page_label": "2", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data
Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Pharmaceutical Quality System and Data Integrity Management: Guidelines for Control, Compliance, and Assurance", "questions_this_excerpt_can_answer": "1. What specific guidelines does the document \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\" provide for the generation, distribution, and control of template records within pharmaceutical quality systems?\n\n2. How does the document address the validation and maintenance requirements for computerised systems in the context of ensuring data integrity and compliance within regulated pharmaceutical environments?\n\n3. What are the detailed steps or considerations outlined in the document for the disposal of original records or true copies to maintain data integrity and comply with regulatory standards in the pharmaceutical industry?", "prev_section_summary": "The section provides an excerpt from the Pharmaceutical Inspection Cooperation and Data Management Guidelines document released by PIC/S on 1 July 2021. It outlines good practices for data management and integrity in regulated GMP/GDP environments. The document prohibits reproduction for commercial purposes but allows for internal use with proper acknowledgment of the source. The editor of the document is the PIC/S Secretariat, who can be contacted via email at info@picscheme.org or through their website. Key topics include data management, integrity, and regulatory compliance in pharmaceutical environments. 
Key entities mentioned are PIC/S and the PIC/S Secretariat.", "excerpt_keywords": "Pharmaceutical Quality System, Data Integrity Management, Regulatory Compliance, Template Records, Computerised Systems"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n|content|page number|\n|---|---|\n|document history|3|\n|introduction|3|\n|purpose|4|\n|scope|5|\n|data governance system|5|\n|what is data governance?|5|\n|data governance systems|6|\n|risk management approach to data governance|7|\n|data criticality|8|\n|data risk|8|\n|data governance system review|9|\n|organisational influences on successful data integrity management|10|\n|general|10|\n|policies related to organisational values, quality, staff conduct and ethics|11|\n|quality culture|12|\n|modernising the pharmaceutical quality system|13|\n|regular management review of performance indicators (including quality metrics)|14|\n|resource allocation|14|\n|dealing with data integrity issues found internally|15|\n|general data integrity principles and enablers|15|\n|specific data integrity considerations for paper-based systems|20|\n|structure of pharmaceutical quality system and control of blank forms/templates/records|20|\n|importance of controlling records|21|\n|generation, distribution and control of template records|21|\n|expectations for the generation, distribution and control of records|21|\n|use and control of records located at the point-of-use|24|\n|filling out records|25|\n|making corrections on records|26|\n|verification of records (secondary checks)|27|\n|direct print-outs from electronic systems|29|\n|document retention (identifying record retention requirements and archiving records)|29|\n|disposal of original records or true copies|30|\n|specific data integrity considerations for computerised systems|31|\n|structure of the pharmaceutical quality system 
and control of computerised systems|31|\n|qualification and validation of computerised systems|32|\n|validation and maintenance|33|\n|data transfer|38|\n|system security for computerised systems|40|\n|audit trails for computerised systems|45|\n|data capture/entry for computerised systems|47|\n|review of data within computerised systems|49|\n|storage, archival and disposal of electronic data|50|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "f1b1dfe8-0efc-4cd2-ae19-568ff7edcd82": {"__data__": {"id_": "f1b1dfe8-0efc-4cd2-ae19-568ff7edcd82", "embedding": null, "metadata": {"page_label": "3", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Data Integrity Management in Pharmaceutical Supply Chains: Ensuring Accuracy and Security", "questions_this_excerpt_can_answer": "1. What are the key principles of good data management practices as outlined in the \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\" document, and how do they contribute to the integrity of data in the pharmaceutical supply chain?\n\n2. How does the \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\" document address the management of hybrid systems and the specific considerations for ensuring data integrity within outsourced activities in the pharmaceutical industry?\n\n3. 
What strategies and actions does the \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\" document recommend for the remediation of data integrity failures, including the classification of deficiencies and indicators of improvement, within the context of pharmaceutical manufacturing and distribution?", "prev_section_summary": "The section covers key topics related to data governance, risk management, organizational influences on data integrity management, general data integrity principles, specific considerations for paper-based and computerized systems, validation and maintenance requirements for computerized systems, data transfer, system security, audit trails, data capture/entry, review of data, and storage/archival/disposal of electronic data. Entities discussed include data governance systems, data criticality, data risk, quality culture, resource allocation, record retention, template records, corrections on records, verification of records, document retention, disposal of records, qualification and validation of computerized systems, system security, audit trails, and data capture/entry for computerized systems.", "excerpt_keywords": "Data Integrity, Pharmaceutical Supply Chains, Good Practices, Regulatory Environments, Data Management"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n## management of hybrid systems\n\n....................................................................................52\n\n## data integrity considerations for outsourced activities\n\ngeneral supply chain considerations: 54\n\n........................................................................................54\n\nstrategies for assessing data integrity in the supply chain: 54\n\n## regulatory actions in response to data integrity findings\n\ndeficiency references: 
56\n\nclassification of deficiencies: 57\n\n## remediation of data integrity failures\n\nresponding to significant data integrity issues: 59\n\nindicators of improvement: 60\n\n## glossary: 61\n\n## revision history\n\nadoption by committee of pi 041-1: 1 june 2021\n\nentry into force of pi 041-1: 1 july 2021\n\n## introduction\n\npic/s participating authorities regularly undertake inspections of manufacturers and distributors of active pharmaceutical ingredients (api) and medicinal products in order to determine the level of compliance with good manufacturing practice (gmp) and good distribution practice (gdp) principles. these inspections are commonly performed on-site; however, they may be performed through the remote or off-site evaluation of documentary evidence, in which case the limitations of remote review of data should be considered.\n\nthe effectiveness of these inspection processes is determined by the reliability of the evidence provided to the inspector and ultimately the integrity of the underlying data. it is critical to the inspection process that inspectors can determine and fully rely on the accuracy and completeness of evidence and records presented to them.\n\ndata management refers to all those activities performed during the handling of data, including but not limited to data policy, documentation, quality and security. good data management practices influence the quality of all data generated and recorded by a manufacturer. these practices should ensure that data is attributable, legible, contemporaneous, original, accurate, complete, consistent, enduring, and available. 
while the main focus of this document is in relation to gmp/gdp expectations, the principles herein should also be considered in the wider context of good data management such as data included in the registration dossier based on which api and drug product control strategies and specifications are set.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "8ffd9df1-3229-49ea-bb92-de671b651bfd": {"__data__": {"id_": "8ffd9df1-3229-49ea-bb92-de671b651bfd", "embedding": null, "metadata": {"page_label": "4", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Best Practices for Data Management and Data Integrity in Pharmaceutical Quality Systems", "questions_this_excerpt_can_answer": "1. What is the definition of data integrity according to the \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\" document, and why is it considered a fundamental requirement for an effective pharmaceutical quality system?\n\n2. How does the \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\" document propose to ensure that inspectorates can effectively interpret GMP/GDP requirements in relation to good data management and conduct of inspections?\n\n3. 
What specific guides or sections does the \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\" document reference to support the implementation of risk-based control strategies for ensuring data validity, completeness, and reliability within the pharmaceutical industry?", "prev_section_summary": "The section discusses the management of hybrid systems, data integrity considerations for outsourced activities, regulatory actions in response to data integrity findings, and remediation of data integrity failures in the pharmaceutical supply chain. It emphasizes the importance of good data management practices to ensure the accuracy and security of data in compliance with good manufacturing practice (GMP) and good distribution practice (GDP) principles. The document also provides strategies for assessing data integrity in the supply chain, responding to data integrity issues, and indicators of improvement. Additionally, it includes a glossary and revision history for reference.", "excerpt_keywords": "Data Integrity, Pharmaceutical Quality System, Good Data Management Practices, Inspectorates, Risk-Based Control Strategies"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n### 2.4 good data management practices\n\napply to all elements of the pharmaceutical quality system and the principles herein apply equally to data generated by electronic and paper-based systems.\n\n### 2.5 data integrity\n\nis defined as \"the degree to which data are complete, consistent, accurate, trustworthy, and reliable and that these characteristics of the data are maintained throughout the data life cycle\". this is a fundamental requirement for an effective pharmaceutical quality system which ensures that medicines are of the required quality. 
poor data integrity practices and vulnerabilities undermine the quality of records and evidence, and may ultimately undermine the quality of medicinal products.\n\n### 2.6 responsibility\n\nfor good practices regarding data management and integrity lies with the manufacturer or distributor undergoing inspection. they have full responsibility and a duty to assess their data management systems for potential vulnerabilities and take steps to design and implement good data governance practices to ensure data integrity is maintained.\n\n## 3 purpose\n\n### 3.1\n\nthis document was written with the aim of:\n\n### 3.1.1 providing guidance for inspectorates\n\nin the interpretation of gmp/gdp requirements in relation to good data management and the conduct of inspections.\n\n### 3.1.2 providing consolidated, illustrative guidance\n\non risk-based control strategies which enable the existing requirements for data to be valid, complete and reliable as described in pic/s guides for gmp and gdp to be implemented in the context of modern industry practices and globalised supply chains.\n\n### 3.1.3 facilitating the effective implementation\n\nof good data management elements into the routine planning and conduct of gmp/gdp inspections; to provide a tool to harmonise gmp/gdp inspections and to ensure the quality of inspections with regards to data integrity expectations.\n\n### 3.2\n\nthis guidance, together with inspectorate resources such as aide memoire, should enable the inspector to make an optimal use of the inspection time and an optimal evaluation of data integrity elements during an inspection.\n\n### 3.3 guidance herein should assist the inspectorate\n\nin planning a risk-based inspection relating to good data management practices.\n\n### 3.4 good data management\n\nhas always been considered an integral part of gmp/gdp. 
hence, this guide is not intended to impose additional regulatory burden upon regulated entities, rather it is intended to provide guidance on the interpretation of existing gmp/gdp requirements relating to current industry data management practices.\n\n1 gxp data integrity guidance and definitions, mhra, march 2018\n\n2 pic/s pe 009 guide to good manufacturing practice for medicinal products, specifically part i chapters 4, 5, 6, part ii chapters 5, 6 & annex 11\n\n3 pic/s pe 011 guide to good distribution practice for medicinal products, specifically sections 3, 4, 5 & 6\n\npi 041-1 4 of 63 1 july 2021", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "1fbc8590-0f0a-460f-aa97-8da0b031e3d9": {"__data__": {"id_": "1fbc8590-0f0a-460f-aa97-8da0b031e3d9", "embedding": null, "metadata": {"page_label": "5", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Best Practices for Data Management, Integrity, and Governance in the Pharmaceutical Industry", "questions_this_excerpt_can_answer": "1. How does the document define the term \"data governance\" within the context of data management and integrity in regulated pharmaceutical environments?\n\n2. What is the scope of the guidance provided in the document regarding on-site and remote inspections for manufacturing (GMP) and distribution (GDP) activities in the pharmaceutical industry?\n\n3. 
How does the document address the applicability of its data management and integrity principles to paper-based, computerized, and hybrid systems in relation to adopting new technologies and concepts in accordance with ICH Q10 principles?", "prev_section_summary": "The section discusses the importance of good data management practices and data integrity in pharmaceutical quality systems. It defines data integrity as the completeness, consistency, accuracy, trustworthiness, and reliability of data throughout its lifecycle. The responsibility for ensuring good data management practices lies with manufacturers and distributors undergoing inspection. The document aims to provide guidance for inspectorates on interpreting GMP/GDP requirements related to data management and conducting inspections. It also offers guidance on risk-based control strategies to ensure data validity, completeness, and reliability in the pharmaceutical industry. The section emphasizes the importance of incorporating good data management practices into routine inspections and harmonizing inspection processes to meet data integrity expectations. It references specific guides and sections to support the implementation of these practices within the pharmaceutical industry.", "excerpt_keywords": "data management, data integrity, pharmaceutical industry, regulatory environments, data governance"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n### 3.5\n\nthe principles of data management and integrity apply equally to paper-based, computerised and hybrid systems and should not place any restraint upon the development or adoption of new concepts or technologies. 
in accordance with ich q10 principles, this guide should facilitate the adoption of innovative technologies through continual improvement.\n\n### 3.6\n\nthe term \"pharmaceutical quality system\" is predominantly used throughout this document to denote the quality management system used to manage and achieve quality objectives. while the term \"pharmaceutical quality system\" is used predominantly by gmp regulated entities, for the purposes of this guidance, it should be regarded as interchangeable with the term \"quality system\" used by gdp regulated entities.\n\n### 3.7\n\nthis guide is not mandatory or enforceable under law. it is not intended to be restrictive or to replace national legislation regarding data integrity requirements for manufacturers and distributors of medicinal products and active substances (i.e. active pharmaceutical ingredients). data integrity deficiencies should be referenced to national legislation or relevant paragraphs of the pic/s gmp or gdp guidance.\n\n### 4 scope\n\n### 4.1\n\nthe guidance has been written to apply to on-site inspections of those sites performing manufacturing (gmp) and distribution (gdp) activities. the principles within this guide are applicable for all stages throughout the product lifecycle. the guide should be considered as a non-exhaustive list of areas to be considered during inspection.\n\n### 4.2\n\nthe guidance also applies to remote (desktop) inspections of sites performing manufacturing (gmp) and distribution (gdp) activities, although this will be limited to an assessment of data governance systems. 
on-site assessment is normally required for data verification and evidence of operational compliance with procedures.\n\n### 4.3\n\nwhilst this document has been written with the above scope, many principles regarding good data management practices described herein have applications for other areas of the regulated pharmaceutical and healthcare industry.\n\n### 4.4\n\nthis guide is not intended to provide specific guidance for \"for-cause\" inspections following detection of significant data integrity vulnerabilities where forensic expertise may be required.\n\n### 5 data governance system\n\n### 5.1 what is data governance?\n\n### 5.1.1\n\ndata governance is the sum total of arrangements which provide assurance of data integrity. these arrangements ensure that data, irrespective of the process, format or technology in which it is generated, recorded, processed, retained, retrieved and used will ensure an attributable, legible, contemporaneous, original, accurate, complete, consistent, enduring, and available record throughout the data lifecycle. 
while there may be no", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "a94cdf23-364a-4d67-9d28-2d67c94131a3": {"__data__": {"id_": "a94cdf23-364a-4d67-9d28-2d67c94131a3", "embedding": null, "metadata": {"page_label": "6", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Implementation of Data Governance System in Pharmaceutical Quality Systems: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. How does the implementation of a data governance system within a pharmaceutical quality system, as described in PIC/S GMP/GDP guidelines, contribute to managing data integrity risks throughout the data lifecycle?\n\n2. What specific organizational and technical controls does the document recommend for ensuring data integrity and compliance with quality risk management principles within the pharmaceutical industry?\n\n3. How does the document suggest regulated entities should approach the design, development, operation, and monitoring of data governance systems to effectively control data management and integrity, considering the complexity of systems and operations?", "prev_section_summary": "The section discusses the principles of data management and integrity in regulated pharmaceutical environments, emphasizing that these principles apply to paper-based, computerized, and hybrid systems without restricting the adoption of new technologies. 
It also explains the concept of a pharmaceutical quality system and clarifies that the guidance provided is not mandatory but serves as a reference for national legislation. The scope of the guidance covers on-site and remote inspections for manufacturing and distribution activities in the pharmaceutical industry, with a focus on data governance systems to ensure data integrity. The section defines data governance as arrangements that provide assurance of data integrity throughout the data lifecycle, emphasizing the importance of maintaining data quality and reliability.", "excerpt_keywords": "Data governance system, Pharmaceutical quality system, Data integrity, Quality risk management, Data lifecycle"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n## legislative requirement to implement a data governance system\n\nits establishment enables the manufacturer to define, prioritise and communicate their data integrity risk management activities in a coherent manner. absence of a data governance system may indicate uncoordinated data integrity systems, with potential for gaps in control measures.\n\n## 5.1.2 the data lifecycle\n\nrefers to how data is generated, processed, reported, checked, used for decision-making, stored and finally discarded at the end of the retention period. data relating to a product or process may cross various boundaries within the lifecycle. this may include data transfer between paper-based and computerised systems, or between different organizational boundaries; both internal (e.g. between production, qc and qa) and external (e.g. between service providers or contract givers and acceptors).\n\n## 5.2 data governance systems\n\n### 5.2.1 data governance systems should be integral to the pharmaceutical quality system described in pic/s gmp/gdp. 
they should address data ownership throughout the lifecycle, and consider the design, operation and monitoring of processes and systems in order to comply with the principles of data integrity, including control over intentional and unintentional changes to, and deletion of information.\n\n### 5.2.2 data governance systems rely on the incorporation of suitably designed systems, the use of technologies and data security measures, combined with specific expertise to ensure that data management and integrity are effectively controlled. regulated entities should take steps to ensure appropriate resources are available and applied in the design, development, operation and monitoring of the data governance systems, commensurate with the complexity of systems, operations, and data criticality and risk.\n\n### 5.2.3 the data governance system should ensure controls over the data lifecycle which are commensurate with the principles of quality risk management. these controls may be:\n\n|organisational|technical|\n|---|---|\n|- procedures, e.g. instructions for completion of records and retention of completed records;\n- training of staff and documented authorization for data generation and approval;\n- data governance system design, considering how data is generated, recorded, processed, retained and used, and risks or vulnerabilities are controlled effectively;\n- routine (e.g. daily, batch- or activity-related) data verification;\n- periodic surveillance, e.g. 
self-inspection processes seek to verify the effectiveness of the data governance system; or\n- the use of personnel with expertise in data management and integrity, including expertise in data security measures.\n|- computerised system validation, qualification and control;\n|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "44ad0e88-06b0-4b5f-a6eb-cabc45a62b28": {"__data__": {"id_": "44ad0e88-06b0-4b5f-a6eb-cabc45a62b28", "embedding": null, "metadata": {"page_label": "7", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Risk Management and Data Governance in Pharmaceutical Quality Systems: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. How does the document describe the role of senior management in establishing an effective data governance system within pharmaceutical quality systems?\n \n2. What principles are recommended for assessing the frequency of reviews for contract acceptors' data management policies and control strategies as part of a vendor assurance program in regulated environments?\n\n3. According to the document, what factors should determine the effort and resources allocated to data governance in entities regulated in accordance with GMP/GDP principles?", "prev_section_summary": "This section discusses the importance of implementing a data governance system in pharmaceutical quality systems to manage data integrity risks throughout the data lifecycle. 
It highlights the legislative requirement for a data governance system, the data lifecycle stages, and the key components of data governance systems such as data ownership, design, operation, and monitoring processes. The section also emphasizes the need for organizational and technical controls to ensure data integrity and compliance with quality risk management principles within the pharmaceutical industry. Key topics include procedures, training, data verification, surveillance, computerized system validation, and personnel expertise in data management and integrity. Regulated entities are advised to allocate appropriate resources for the design, development, operation, and monitoring of data governance systems based on the complexity of systems, operations, and data criticality and risk.", "excerpt_keywords": "Keywords: Data Management, Data Integrity, Pharmaceutical Quality Systems, Risk Management, Data Governance"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n## 5.2.4\n\nan effective data governance system will demonstrate senior management's understanding and commitment to effective data governance practices including the necessity for a combination of appropriate organisational culture and behaviors (section 6) and an understanding of data criticality, data risk and data lifecycle. there should also be evidence of communication of expectations to personnel at all levels within the organization in a manner which ensures empowerment to report failures and opportunities for improvement. 
this reduces the incentive to falsify, alter or delete data.\n\n## 5.2.5\n\nthe organization's arrangements for data governance should be documented within their pharmaceutical quality system and regularly reviewed.\n\n## 5.3 risk management approach to data governance\n\n## 5.3.1\n\nsenior management is responsible for the implementation of systems and procedures to minimize the potential risk to data integrity, and for identifying the residual risk, using the principles of ich q9. contract givers should perform a review of the contract acceptor's data management policies and control strategies as part of their vendor assurance program. the frequency of such reviews should be based on the criticality of the services provided by the contract acceptor, using risk management principles (refer to section 10).\n\n## 5.3.2\n\nthe effort and resource assigned to data governance should be commensurate with the risk to product quality, and should also be balanced with other quality resource demands. all entities regulated in accordance with gmp/gdp principles (including manufacturers, analytical laboratories, importers and wholesale distributors) should design and operate a system which provides an acceptable state of control based on the data quality risk, and which is documented with supporting rationale.\n\n## 5.3.3\n\nwhere long term measures are identified in order to achieve the desired state of control, interim measures should be implemented to mitigate risk, and should be monitored for effectiveness. where interim measures or risk prioritization are required, residual data integrity risk should be communicated to senior management, and kept under review. reverting from automated and computerized systems to paper-based systems will not remove the need for data governance. 
such retrograde approaches are likely to increase administrative burden and data risk, and prevent the continuous improvement initiatives referred to in paragraph 3.5.\n\n## 5.3.4\n\nnot all data or processing steps have the same importance to product quality and patient safety. risk management should be utilized to determine the importance of each data/processing step. an effective risk management approach to data governance will consider:\n\n- data criticality (impact to decision making and product quality)\n- data risk (opportunity for data alteration and deletion, and likelihood of detection/visibility of changes by the manufacturers routine review processes)", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "14a02b9a-d956-4a38-8322-167e5423e9fb": {"__data__": {"id_": "14a02b9a-d956-4a38-8322-167e5423e9fb", "embedding": null, "metadata": {"page_label": "8", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Data Criticality and Risk Assessment in Data Governance: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. How does the document define the relationship between data criticality and the decision-making process in a regulated environment, particularly in relation to batch release decisions and the impact of different types of data on product quality and safety?\n\n2. 
What specific factors does the document suggest should be considered in a data risk assessment within a GMP/GDP environment to ensure data integrity and prevent data vulnerabilities such as alteration, deletion, or falsification?\n\n3. According to the document, how should organizations prioritize their data governance efforts based on the assessment of data criticality and risk, and what are the recommended practices for documenting the rationale behind this prioritization in accordance with quality risk management principles?", "prev_section_summary": "This section focuses on the importance of establishing an effective data governance system within pharmaceutical quality systems, with a specific emphasis on the role of senior management in understanding and committing to data governance practices. It highlights the need for appropriate organizational culture, communication of expectations to personnel, and the documentation of data governance arrangements within the pharmaceutical quality system. The section also discusses the risk management approach to data governance, emphasizing the responsibility of senior management in minimizing potential risks to data integrity and the importance of assessing the frequency of reviews for contract acceptors' data management policies. Additionally, it stresses the need for allocating resources to data governance based on the risk to product quality and the importance of implementing interim measures to mitigate risks. 
The section concludes by emphasizing the importance of utilizing risk management to determine the importance of each data/processing step in ensuring product quality and patient safety.", "excerpt_keywords": "Data Criticality, Risk Assessment, Data Governance, GMP, GDP"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n#### from this information, risk proportionate control measures can be implemented. subsequent sections of this guidance that refer to a risk management approach refer to risk as a combination of data risk and data criticality concepts.\n\n#### data criticality\n\n|5.4.1|the decision that data influences may differ in importance and the impact of the data to a decision may also vary. points to consider regarding data criticality include:|\n|---|---|\n| |- which decision does the data influence? for example: when making a batch release decision, data which determines compliance with critical quality attributes is normally of greater importance than warehouse cleaning records.|\n| |- what is the impact of the data to product quality or safety? for example: for an oral tablet, api assay data is of generally greater impact to product quality and safety than tablet friability data.|\n\n#### data risk\n\n|5.5.1|whereas data integrity requirements relate to all gmp/gdp data, the assessment of data criticality will help organisations to prioritise their data governance efforts. the rationale for this prioritisation should be documented in accordance with quality risk management principles.|\n|---|---|\n|5.5.2|data risk assessments should consider the vulnerability of data to involuntary alteration, deletion, loss (either accidental or by security failure) or re-creation or deliberate falsification, and the likelihood of detection of such actions. 
consideration should also be given to ensuring complete and timely data recovery in the event of a disaster. control measures which prevent unauthorised activity, and increase visibility / detectability can be used as risk mitigating actions.|\n|5.5.3|examples of factors which can increase risk of data failure include processes that are complex, or inconsistent, with open ended and subjective outcomes. simple processes with tasks which are consistent, well defined and objective lead to reduced risk.|\n|5.5.4|risk assessments should focus on a business process (e.g. production, qc), evaluate data flows and the methods of generating and processing data, and not just consider information technology (it) system functionality or complexity. factors to consider include:|\n| |- process complexity (e.g. multi-stage processes, data transfer between processes or systems, complex data processing);|\n| |- methods of generating, processing, storing and archiving data and the ability to assure data quality and integrity;|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "195af384-c7ae-42e3-b353-95430498e04d": {"__data__": {"id_": "195af384-c7ae-42e3-b353-95430498e04d", "embedding": null, "metadata": {"page_label": "9", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Data Integrity and Governance Review in Computerized Systems: A Comprehensive Analysis", "questions_this_excerpt_can_answer": "1. 
How does the variability of biological production processes or analytical tests compared to small molecule chemistry impact process consistency in regulated environments, according to the PI 041-1 guidelines from 2021?\n\n2. What specific considerations should be made regarding manual interfaces with IT systems in the risk assessment process for computerized systems to ensure data integrity, as outlined in section 9 of the PI 041-1 document?\n\n3. According to the 2021 PI 041-1 guidelines, what are the recommended practices for conducting self-inspections or internal audits to assess the effectiveness of data integrity control measures throughout the data lifecycle?", "prev_section_summary": "This section discusses the concepts of data criticality and data risk assessment in a regulated environment. It highlights the importance of considering the impact of data on decision-making processes, particularly in relation to batch release decisions and product quality and safety. The document suggests factors to be considered in a data risk assessment to ensure data integrity and prevent vulnerabilities such as alteration, deletion, or falsification. It emphasizes the need for organizations to prioritize their data governance efforts based on the assessment of data criticality and risk, and recommends documenting the rationale behind this prioritization in accordance with quality risk management principles. The section also provides examples of factors that can increase the risk of data failure and outlines control measures that can be implemented to mitigate risks.", "excerpt_keywords": "Data Integrity, Governance, Computerized Systems, Risk Assessment, Self-Inspection"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n##### process consistency\n\n(e.g. 
biological production processes or analytical tests may exhibit a higher degree of variability compared to small molecule chemistry);\n\n##### degree of automation / human interaction;\n\n##### subjectivity of outcome / result\n\n(i.e. is the process open-ended vs well defined);\n\n##### outcomes of a comparison between electronic system data and manually recorded events\n\n(e.g. apparent discrepancies between analytical reports and raw-data acquisition times); and\n\n##### inherent data integrity controls incorporated into the system or software.\n\n##### for computerised systems, manual interfaces with it systems should be considered in the risk assessment process.\n\ncomputerised system validation in isolation may not result in low data integrity risk, in particular, if the user is able to influence the reporting of data from the validated system, and system validation does not address the basic requirements outlined in section 9 of this document. a fully automated and validated process together with a configuration that does not allow human intervention, or reduces human intervention to a minimum, is preferable as this design lowers the data integrity risk. appropriate procedural controls should be installed and verified where integrated controls are not possible for technical reasons.\n\n##### critical thinking skills should be used by inspectors to determine whether control and review procedures effectively achieve their desired outcomes.\n\nan indicator of data governance maturity is an organizational understanding and acceptance of residual risk, which prioritizes actions. an organization which believes that there is no risk of data integrity failure is unlikely to have made an adequate assessment of inherent risks in the data lifecycle. the approach to assessment of data lifecycle, criticality and risk should therefore be examined in detail. 
this may indicate potential failure modes which can be investigated during an inspection.\n\n#### data governance system review\n\n##### the effectiveness of data integrity control measures should be assessed periodically as part of self-inspection (internal audit) or other periodic review processes. this should ensure that controls over the data lifecycle are operating as intended.\n\n##### in addition to routine data verification checks (e.g. daily, batch- or activity-related), self-inspection activities should be extended to a wider review of control measures, including:\n\n- a check of continued personnel understanding of good data management practice in the context of protection of the patient, and ensuring the maintenance of a working environment which is focused on quality and open reporting of issues (e.g. by review of continued training in good data management principles and expectations).\n- a review for consistency of reported data/outcomes against raw entries. this may review data not included during the routine data verification
"questions_this_excerpt_can_answer": "1. How does the document describe the impact of organizational culture on the effectiveness of data integrity management and reporting practices within regulated environments?\n \n2. What specific strategies does the document recommend for managing data integrity in organizations with open versus closed cultural attitudes towards hierarchy and reporting failures?\n\n3. How does the document propose inspectors should approach and report data integrity concerns that are influenced by organizational culture and behavior, particularly when direct reporting of inspection deficiencies related to organizational behavior may not be appropriate or possible?", "prev_section_summary": "The section discusses the importance of process consistency in regulated environments, particularly in biological production processes and analytical tests. It emphasizes the need for automation and minimal human interaction to reduce data integrity risks in computerized systems. The document outlines specific considerations for manual interfaces with IT systems in the risk assessment process. It also highlights the importance of periodic self-inspections or internal audits to assess the effectiveness of data integrity control measures throughout the data lifecycle. 
The section emphasizes the need for critical thinking skills in determining the effectiveness of control and review procedures, as well as the importance of organizational understanding and acceptance of residual risk in prioritizing actions.", "excerpt_keywords": "Data integrity, Organizational culture, Regulatory environments, Inspection deficiencies, Quality culture"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n## checks\n\n(where justified based on risk), and/or a sample of previously verified data to ensure the continued effectiveness of the routine process.\n\n## 6 organisational influences on successful data integrity management\n\n### 6.1 general\n\n6.1.1 it may not be appropriate or possible to report an inspection deficiency relating to organisational behaviour. an understanding of how behaviour influences (i) the incentive to amend, delete or falsify data and (ii) the effectiveness of procedural controls designed to ensure data integrity, can provide the inspector with useful indicators of risk which can be investigated further.\n\n6.1.2 inspectors should be sensitive to the influence of culture on organisational behaviour, and apply the principles described in this section of the guidance in an appropriate way. an effective quality culture and data governance may be different in its implementation from one location to another. 
however, where it is apparent that cultural approaches have led to data integrity concerns, these concerns should be effectively and objectively reported by the inspector to the organisation for rectification.\n\n6.1.3 depending on culture, an organisation's control measures may be:\n\n- open (where hierarchy can be challenged by subordinates, and full reporting of a systemic or individual failure is a business expectation)\n- closed (where reporting failure or challenging a hierarchy is culturally more difficult)\n\n6.1.4 good data governance in open cultures may be facilitated by employee empowerment to identify and report issues through the pharmaceutical quality system. in closed cultures, a greater emphasis on oversight and secondary review may be required to achieve an equivalent level of control due to the social barrier of communicating undesirable information. the availability of a confidential escalation process to senior management may also be of greater importance in this situation, and these arrangements\n\n4 an exception report is a validated search tool that identifies and documents predetermined abnormal data or actions, which require further attention or investigation by the data reviewer.
"2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Data Integrity Management and Organisational Culture: Ensuring Quality and Compliance in the Workplace", "questions_this_excerpt_can_answer": "1. How does the document PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021) define the role of management in preventing and detecting data integrity lapses within an organization?\n\n2. What specific strategies does the document suggest for addressing both direct and indirect influences on employee behavior to mitigate risks associated with data integrity, according to sections 5.3 to 5.5?\n\n3. How does PI 041-1 outline the process for establishing controls to assure data integrity, including the prevention, detection, assessment, and correction of data integrity breaches, as detailed in sections 6.2 to 6.7?", "prev_section_summary": "The section discusses the organizational influences on successful data integrity management within regulated environments. It highlights the impact of organizational culture on data integrity practices, the importance of understanding how behavior influences data integrity, and the need for inspectors to be sensitive to cultural influences. The document recommends specific strategies for managing data integrity in organizations with open versus closed cultural attitudes towards hierarchy and reporting failures. It also emphasizes the importance of effective quality culture and data governance, and the different control measures that may be necessary based on organizational culture. 
Additionally, it mentions the use of exception reports as a tool for identifying abnormal data or actions that require further investigation.", "excerpt_keywords": "Data integrity, Management, Organisational culture, Quality, Compliance"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n###### 6.1.5\n\nthe extent of management's knowledge and understanding of data integrity can influence the organisation's success of data integrity management. management should know their legal and moral obligation (i.e. duty and power) to prevent data integrity lapses from occurring and to detect them, if they should occur. management should have sufficient visibility and understanding of data integrity risks for paper and computerised (both hybrid and electronic) workflows.\n\n###### 6.1.6\n\nlapses in data integrity are not limited to fraud or falsification; they can be unintentional and still pose risk. any potential for compromising the reliability of data is a risk that should be identified and understood in order for appropriate controls to be put in place (refer sections 5.3 - 5.5). 
direct controls usually take the form of written policies and procedures, but indirect influences on employee behaviour (such as undue pressure, incentives for productivity in excess of process capability, opportunities for compromising data and employee rationalisation of negative behaviours) should be understood and addressed as well.\n\n###### 6.1.7\n\ndata integrity breaches can occur at any time, by any employee, so management needs to be vigilant in detecting issues and understand reasons behind lapses, when found, to enable investigation of the issue and implementation of corrective and preventive actions.\n\n###### 6.1.8\n\nthere are consequences of data integrity lapses that affect the various stakeholders (patients, regulators, customers) including directly impacting patient safety and undermining confidence in the organisation and its products. employee awareness and understanding of these consequences can be helpful in fostering an environment in which quality is a priority.\n\n###### 6.1.9\n\nmanagement should establish controls to prevent, detect, assess and correct data integrity breaches, as well as verify those controls are performing as intended to assure data integrity. sections 6.2 to 6.7 outline the key items that management should address to achieve success with data integrity.\n\n###### 6.1.10\n\nsenior management should have an appropriate level of understanding and commitment to effective data governance practices including the necessity for a combination of appropriate organisational culture and behaviors (section 6) and an understanding of data criticality, data risk and data lifecycle. there should also be evidence of communication of expectations to personnel at all levels within the organisation in a manner which ensures empowerment to report failures and opportunities for improvement. 
this reduces the incentive to falsify, alter or delete data.\n\n###### 6.2\n\npolicies related to organisational values, quality, staff conduct and ethics\n\n###### 6.2.1\n\nappropriate expectations for staff conduct, commitment to quality, organisational values and ethics should be clearly communicated throughout the organisation and policies should be available to support the implementation and maintenance of an appropriate quality culture. policies should reflect management's philosophy on quality, and should be written with the intent of developing an environment of trust, where all individuals are responsible and accountable for ensuring patient safety and product quality.\n\npi 041-1\n11 of 63\n1 july 2021", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "e4de2752-21f0-4fe6-9995-6db55a5ebff7": {"__data__": {"id_": "e4de2752-21f0-4fe6-9995-6db55a5ebff7", "embedding": null, "metadata": {"page_label": "12", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Promoting Data Quality, Integrity, and a Quality Culture in the Workplace: Strategies and Best Practices", "questions_this_excerpt_can_answer": "1. 
How does the document suggest management should address unwanted behaviors that compromise data quality and integrity within a regulated environment, and what considerations should be taken when implementing disciplinary actions to ensure they do not hinder further investigation into data integrity issues?\n\n2. What specific strategies does the document recommend for fostering a quality culture within the workplace that promotes transparency, open communication of failures and mistakes, and the free flow of information between personnel at all levels?\n\n3. According to the document, what role does management play in ensuring personnel are aware of the importance of their role in data quality and integrity, and how should policies related to ethical behavior and the consequences of failing to meet these expectations be communicated to all personnel?", "prev_section_summary": "The section discusses the importance of management's role in preventing and detecting data integrity lapses within an organization. It emphasizes the need for management to have knowledge and understanding of data integrity risks, establish controls to assure data integrity, and address both direct and indirect influences on employee behavior. The consequences of data integrity breaches on stakeholders, the necessity for effective data governance practices, and the establishment of policies related to organizational values, quality, staff conduct, and ethics are also highlighted. 
Overall, the section underscores the significance of creating a culture where quality is a priority and where all individuals are responsible for ensuring patient safety and product quality.", "excerpt_keywords": "Data Management, Data Integrity, Quality Culture, Ethical Behavior, Organizational Values"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n## 6.2 data quality and integrity\n\n6.2.2 management should make personnel aware of the importance of their role in ensuring data quality and the implication of their activities to assuring product quality and protecting patient safety.\n\n6.2.3 policies should clearly define the expectation of ethical behaviour, such as honesty. this should be communicated to and be well understood by all personnel. the communication should not be limited only to knowing the requirements, but also why they were established and the consequences of failing to fulfill the requirements.\n\n6.2.4 unwanted behaviors, such as deliberate data falsification, unauthorized changes, destruction of data, or other conduct that compromises data quality should be addressed promptly. examples of unwanted behaviors and attitudes should be documented in the company policies. actions to be taken in response to unwanted behaviors should be documented. however, care should be taken to ensure that actions taken (such as disciplinary actions) do not impede any subsequent investigation into the data integrity issues identified, e.g. 
severe retribution may prevent other staff members from disclosing information of value to the investigation.\n\n6.2.5 the display of behaviors that conform to good practices for data management and integrity should be actively encouraged and recognized appropriately.\n\n6.2.6 there should be a confidential escalation program supported by company policies and procedures whereby personnel are encouraged to bring instances of possible breaches of policies to the attention of senior management without consequence for the informer/employee. the potential for breaches of the policies by senior management should be recognized and a suitable reporting mechanism for those cases should be available.\n\n6.2.7 where possible, management should implement systems with controls that, by default, uphold the intent and requirements of company policies.\n\n## 6.3 quality culture\n\n6.3.1 management should aim to create a work environment (i.e. quality culture) that is transparent and open, one in which personnel are encouraged to freely communicate failures and mistakes, including potential data reliability issues, so that corrective and preventive actions can be taken. organizational reporting structure should permit the information flow between personnel at all levels.\n\n6.3.2 it is the collection of values, beliefs, thinking, and behaviors demonstrated consistently by management, team leaders, quality personnel, and all personnel that contributes to creating a quality culture to assure data quality and integrity.\n\n6.3.3 management can foster quality culture by:\n\n- ensuring awareness and understanding of expectations (e.g. 
code of values and ethics and code of conduct),\n- leading by example, management should demonstrate the behaviors they expect to see,\n- being accountable for actions and decisions, particularly delegated activities.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "c6f7dcae-738c-42f2-8a33-5305ad0e1d4a": {"__data__": {"id_": "c6f7dcae-738c-42f2-8a33-5305ad0e1d4a", "embedding": null, "metadata": {"page_label": "13", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "\"Modernizing Pharmaceutical Quality Systems to Ensure Data Integrity and Enhance Overall Quality\"", "questions_this_excerpt_can_answer": "1. What specific strategies does the document suggest for fostering a positive culture towards ensuring data integrity within a pharmaceutical organization?\n \n2. How does the document propose modernizing the pharmaceutical quality system to address the challenges associated with managing complex data?\n\n3. What are the key areas identified by the document for implementing control and procedural changes to prevent, detect, and correct weaknesses that may lead to data integrity lapses in the pharmaceutical industry?", "prev_section_summary": "The section discusses the importance of data quality and integrity in regulated environments, emphasizing the role of management in promoting a quality culture within the workplace. 
Key topics include making personnel aware of their role in ensuring data quality, addressing unwanted behaviors that compromise data integrity, fostering transparency and open communication, and implementing systems and controls to uphold company policies. Entities mentioned include management, personnel, company policies, ethical behavior, data falsification, and quality culture.", "excerpt_keywords": "data integrity, pharmaceutical quality system, modernization, control and procedural changes, quality risk management"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n- staying continuously and actively involved in the operations of the business,\n- setting realistic expectations, considering the limitations that place pressures on employees,\n- allocating appropriate technical and personnel resources to meet operational requirements and expectations,\n- implementing fair and just consequences and rewards that promote good cultural attitudes towards ensuring data integrity, and\n- being aware of regulatory trends to apply \"lessons learned\" to the organization.\n\n##### modernising the pharmaceutical quality system\n\n|6.4.1|the application of modern quality risk management principles and good data management practices to the current pharmaceutical quality system serves to modernize the system to meet the challenges that come with the generation of complex data.|\n|---|---|\n|6.4.2|the company's pharmaceutical quality system should be able to prevent, detect and correct weaknesses in the system or their processes that may lead to data integrity lapses. the company should know their data life cycle and integrate the appropriate controls and procedures such that the data generated will be valid, complete and reliable. 
specifically, such control and procedural changes may be in the following areas:|\n\n- quality risk management,\n- investigation programs,\n- data review practices (section 9),\n- computerised system validation,\n- it infrastructure, services and security (physical and virtual),\n- vendor/contractor management,\n- training program to include company's approach to data governance and data governance sops,\n- storage, processing, transfer and retrieval of completed records, including decentralised/cloud-based data storage, processing and transfer activities,\n- appropriate oversight of the purchase of gmp/gdp critical equipment and it infrastructure that incorporate requirements designed to meet data integrity expectations, e.g. user requirement specifications (refer section 9.2),\n- self-inspection program to include data quality and integrity, and\n- performance indicators (quality metrics) and reporting to senior management.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "a3bf2925-3d16-4976-be12-2c0620a261d6": {"__data__": {"id_": "a3bf2925-3d16-4976-be12-2c0620a261d6", "embedding": null, "metadata": {"page_label": "14", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Optimizing Resource Allocation for Data Integrity Management: Strategies and Best Practices", "questions_this_excerpt_can_answer": "1. 
What specific strategies does the document suggest for ensuring that management regularly reviews performance indicators related to data integrity, and how does it propose these reviews impact the prioritization of data integrity within an organization's culture?\n\n2. How does the document recommend organizations allocate resources to support and sustain good data integrity management, particularly in relation to preventing workload and pressure from compromising data integrity?\n\n3. What role does the document assign to independent experts in the verification of the effectiveness of an organization's systems and controls for data integrity, and what are the recommended qualifications and training for personnel involved in data management and integrity?", "prev_section_summary": "The section discusses strategies for fostering a positive culture towards ensuring data integrity within a pharmaceutical organization, modernizing the pharmaceutical quality system to address challenges associated with managing complex data, and implementing control and procedural changes to prevent, detect, and correct weaknesses that may lead to data integrity lapses in the pharmaceutical industry. 
Key topics include quality risk management, investigation programs, data review practices, computerized system validation, IT infrastructure and security, vendor/contractor management, training programs for data governance, storage and retrieval of records, oversight of equipment purchases, self-inspection programs, and performance indicators for reporting to senior management.", "excerpt_keywords": "Data Management, Data Integrity, Resource Allocation, Performance Indicators, Training Programs"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n## 6.5 regular management review of performance indicators (including quality metrics)\n\n|6.5.1|there should be regular management reviews of performance indicators, including those related to data integrity, such that significant issues are identified, escalated and addressed in a timely manner. 
caution should be taken when key performance indicators are selected so as not to inadvertently result in a culture in which data integrity is lower in priority.|\n|---|---|\n|6.5.2|the head of the quality unit should have direct access to senior management in order to directly communicate risks so that senior management is aware and can allocate resources to address any issues.|\n|6.5.3|management can have an independent expert periodically verify the effectiveness of their systems and controls.|\n\n## 6.6 resource allocation\n\n|6.6.1|management should allocate appropriate resources to support and sustain good data integrity management such that the workload and pressures on those responsible for data generation and record keeping do not increase the likelihood of errors or the opportunity to deliberately compromise data integrity.|\n|---|---|\n|6.6.2|there should be a sufficient number of personnel for quality and management oversight, it support, conduct of investigations, and management of training programs that are commensurate with the operations of the organization.|\n|6.6.3|there should be provisions to purchase equipment, software and hardware that are appropriate for their needs, based on the criticality of the data in question. companies should implement technical solutions that improve compliance with alcoa+ 5 principles and thus mitigate weaknesses in relation to data quality and integrity.|\n|6.6.4|personnel should be qualified and trained for their specific duties, with appropriate segregation of duties, including the importance of good documentation practices (gdocps). there should be evidence of the effectiveness of training on critical procedures, such as electronic data review. 
the concept of good data management practices applies to all functional departments that play a role in gmp/gdp, including areas such as it and engineering.|\n|6.6.5|data quality and integrity should be familiar to all, but data quality experts from various levels (smes, supervisors, team leaders) may be called upon to work together to conduct/support investigations, identify system gaps and drive implementation of improvements.|\n|6.6.6|introduction of new roles in an organization relating to good data management such as a data custodian might be considered.|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "60fec87f-459c-4791-b671-57d83796da9e": {"__data__": {"id_": "60fec87f-459c-4791-b671-57d83796da9e", "embedding": null, "metadata": {"page_label": "15", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Ensuring Data Integrity in Pharmaceutical Quality Systems: Principles and Enablers", "questions_this_excerpt_can_answer": "1. What specific steps should be taken when a data integrity lapse is discovered within a pharmaceutical quality system according to the PI 041-1 guidelines from 2021?\n\n2. How does the PI 041-1 document from 2021 suggest handling the documentation of decisions and actions to ensure data integrity in pharmaceutical quality systems, and what key principles does it recommend?\n\n3. 
What are the additional attributes to the ALCOA principle as outlined in the PI 041-1 2021 document, and how do they contribute to ensuring data integrity in both physical and electronic recordkeeping within the pharmaceutical industry?", "prev_section_summary": "The section discusses strategies and best practices for optimizing resource allocation for data integrity management in regulated environments. Key topics include regular management review of performance indicators related to data integrity, allocation of resources to support good data integrity management, the role of independent experts in verifying system effectiveness, personnel qualifications and training, and the importance of data quality and integrity across all functional departments. Key entities mentioned include senior management, quality unit head, personnel responsible for data generation and record keeping, IT support, data quality experts, and potential new roles such as a data custodian.", "excerpt_keywords": "Data Integrity, Pharmaceutical Quality System, Good Documentation Practices, ALCOA Principle, Regulatory Authorities"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n## dealing with data integrity issues found internally\n\n|6.7.1|in the event that data integrity lapses are found, they should be handled as any deviation would be according to the pharmaceutical quality system. it is important to determine the extent of the problem as well as its root cause, then correct the issue to its full extent and implement preventive measures. 
this may include the use of a third party for additional expertise or perspective, which may involve a gap assessment to identify weaknesses in the system.|\n|---|---|\n|6.7.2|when considering the impact on patient safety and product quality, any conclusions drawn should be supported by sound scientific evidence.|\n|6.7.3|corrections may include product recall, client notification and reporting to regulatory authorities. corrections and corrective action plans and their implementation should be recorded and monitored.|\n|6.7.4|further guidance may be found in section 12 of this guide.|\n\n## general data integrity principles and enablers\n\n7.1 the pharmaceutical quality system should be implemented throughout the different stages of the life cycle of the apis and medicinal products and should encourage the use of science and risk-based approaches.\n\n7.2 to ensure that decision making is well informed and to verify that the information is reliable, the events or actions that informed those decisions should be well documented. as such, good documentation practices are key to ensuring data integrity, and a fundamental part of a well-designed pharmaceutical quality system (discussed in section 6).\n\n7.3 the application of gdocps may vary depending on the medium used to record the data (i.e. physical vs. electronic records), but the principles are applicable to both. this section will introduce those key principles and following sections (8 & 9) will explore these principles relative to documentation in both paper-based and electronic-based recordkeeping.\n\n7.4 some key concepts of gdocps are summarised by the acronym alcoa: attributable, legible, contemporaneous, original, and accurate. the following attributes can be added to the list: complete, consistent, enduring and available (alcoa+). 
together, these expectations ensure that events are properly documented and the data can be used to support informed decisions.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "10435c38-1950-4f50-bfdf-0d06c25b3d68": {"__data__": {"id_": "10435c38-1950-4f50-bfdf-0d06c25b3d68", "embedding": null, "metadata": {"page_label": "16", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Data Integrity Principles for Paper and Electronic Systems", "questions_this_excerpt_can_answer": "1. How does the document define the requirement for data to be \"attributable\" within the context of data management and integrity in regulated environments, and what specific examples does it provide to illustrate this principle?\n\n2. In what ways does the document suggest ensuring the \"legibility\" of records, and how does it address the importance of the dynamic nature of electronic data in maintaining the integrity and usefulness of records?\n\n3. What comprehensive strategies does the document recommend for maintaining the \"accuracy\" of records in a pharmaceutical quality system, including the role of equipment, policies, procedures, and deviation management?", "prev_section_summary": "This section discusses the handling of data integrity issues within pharmaceutical quality systems, emphasizing the importance of identifying root causes, implementing corrective actions, and documenting decisions. 
It also highlights the principles and enablers of data integrity, such as the use of science and risk-based approaches, good documentation practices, and the ALCOA+ attributes for ensuring proper documentation in both physical and electronic recordkeeping. The section provides guidance on dealing with data integrity lapses, ensuring patient safety and product quality, and the importance of recording and monitoring corrective actions.", "excerpt_keywords": "Data Integrity, Paper and Electronic Systems, ALCOA+, Pharmaceutical Quality System, Equipment Validation, Deviation Management"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n## basic data integrity principles applicable to both paper and electronic systems (i.e. alcoa +):\n\n|data integrity attribute|requirement|\n|---|---|\n|attributable|it should be possible to identify the individual or computerized system that performed a recorded task and when the task was performed. this also applies to any changes made to records, such as corrections, deletions, and changes where it is important to know who made a change, when, and why.|\n|legible|all records should be legible - the information should be readable and unambiguous in order for it to be understandable and of use. this applies to all information that would be required to be considered complete, including all original records or entries. where the dynamic nature of electronic data (the ability to search, query, trend, etc.) is important to the content and meaning of the record, the ability to interact with the data using a suitable application is important to the availability of the record.|\n|contemporaneous|the evidence of actions, events or decisions should be recorded as they take place. this documentation should serve as an accurate attestation of what was done, or what was decided and why, i.e. 
what influenced the decision at that time.|\n|original|the original record can be described as the first-capture of information, whether recorded on paper (static) or electronically (usually dynamic, depending on the complexity of the system). information that is originally captured in a dynamic state should remain available in that state.|\n|accurate|records need to be a truthful representation of facts to be accurate. ensuring records are accurate is achieved through many elements of a robust pharmaceutical quality system. this can be comprised of: - equipment related factors such as qualification, calibration, maintenance and computer validation.\n- policies and procedures to control actions and behaviors, including data review procedures to verify adherence to procedural requirements.\n- deviation management including root cause analysis, impact assessments and capa.\n|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "bf717834-4f43-4073-9eed-670cfcdde9fe": {"__data__": {"id_": "bf717834-4f43-4073-9eed-670cfcdde9fe", "embedding": null, "metadata": {"page_label": "17", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Ensuring Data Integrity in Pharmaceutical Quality Systems: Best Practices and Guidelines", "questions_this_excerpt_can_answer": "1. 
What are the key attributes of data integrity as outlined in the document \"Ensuring Data Integrity in Pharmaceutical Quality Systems: Best Practices and Guidelines\" from 2021, and how do they contribute to the reliability of information used in making critical decisions regarding drug products?\n\n2. How does the document \"Ensuring Data Integrity in Pharmaceutical Quality Systems: Best Practices and Guidelines\" define the requirements for a record to be considered \"complete,\" and what role does metadata play in this context?\n\n3. According to the 2021 guidelines in \"Ensuring Data Integrity in Pharmaceutical Quality Systems: Best Practices and Guidelines,\" what measures should be taken to ensure that copies of original paper records, such as analytical summary reports and validation reports, are maintained as \"true copies\" and how should these records be controlled during their life cycle?", "prev_section_summary": "The section discusses basic data integrity principles applicable to both paper and electronic systems, emphasizing the attributes of data integrity such as being attributable, legible, contemporaneous, original, and accurate. It highlights the importance of being able to identify the individual or system responsible for recorded tasks, ensuring records are readable and unambiguous, recording actions as they occur, preserving the original state of information, and maintaining the accuracy of records through various elements of a pharmaceutical quality system. 
The section provides guidelines for ensuring data management and integrity in regulated environments.", "excerpt_keywords": "Data Integrity, Pharmaceutical Quality Systems, Record Completeness, Metadata, True Copies"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n## data integrity requirement attribute\n\ntrained and qualified personnel who understand the importance of following established procedures and documenting their actions and decisions.\ntogether, these elements aim to ensure the accuracy of information, including scientific data that is used to make critical decisions about the quality of products.\n\n## complete\n\nall information that would be critical to recreating an event is important when trying to understand the event. it is important that information is not lost or deleted. the level of detail required for an information set to be considered complete would depend on the criticality of the information (see section 5.4 data criticality). a complete record of data generated electronically includes relevant metadata (see section 9).\n\n## consistent\n\ninformation should be created, processed, and stored in a logical manner that has a defined consistency. this includes policies or procedures that help control or standardize data (e.g. chronological sequencing, date formats, units of measurement, approaches to rounding, significant digits, etc.).\n\n## enduring\n\nrecords should be kept in a manner such that they exist for the entire period during which they might be needed.
this means they need to remain intact and accessible as an indelible/durable record throughout the record retention period.\n\n## available\n\nrecords should be available for review at any time during the required retention period, accessible in a readable format to all applicable personnel who are responsible for their review whether for routine release decisions, investigations, trending, annual reports, audits or inspections.\n\nif these elements are appropriately applied to all applicable areas of gmp and gdp related activities, along with other supporting elements of a pharmaceutical quality system, the reliability of the information used to make critical decisions regarding drug products should be adequately assured.\n\n## true copies\n\ncopies of original paper records (e.g. analytical summary reports, validation reports, etc.) are generally very useful for communication purposes, e.g. between companies operating at different locations. these records should be controlled during their life cycle to ensure that the data received from another site (sister company, contractor, etc.) 
are maintained as \"true copies\"", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "11298353-5d97-483c-9d94-a2b9d0d72763": {"__data__": {"id_": "11298353-5d97-483c-9d94-a2b9d0d72763", "embedding": null, "metadata": {"page_label": "18", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Issuing and Controlling True Copies of Documents: Guidelines and Procedures", "questions_this_excerpt_can_answer": "1. What specific steps should be taken to ensure the integrity and authenticity of a \"true copy\" of a paper document in a regulated environment, according to the guidelines provided in the document \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\"?\n\n2. How does the document \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\" address the creation and control of \"true copies\" of electronic records to prevent the loss of metadata, and what are the recommended practices for distributing these copies?\n\n3. 
In the context of data management and integrity for medicinal products, what considerations and procedures does the document \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\" recommend for retaining electronic records in a dynamic format, and how does it suggest utilizing risk management principles to justify the retention period and format?", "prev_section_summary": "The section discusses key attributes of data integrity, such as the importance of trained personnel, completeness of records, consistency in data processing, endurance of records, and availability of records for review. It emphasizes the need for records to be maintained as \"true copies\" to ensure accuracy and reliability in making critical decisions regarding drug products. The role of metadata in defining record completeness is also highlighted, along with the importance of controlling and maintaining copies of original paper records throughout their life cycle.", "excerpt_keywords": "Keywords: Data Management, Data Integrity, Regulated Environments, True Copies, Electronic Records"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\nwhere appropriate, or used as a \"summary report\" where the requirements of a \"true copy\" are not met (e.g. summary of complex analytical data).\n\n7.7.2 it is conceivable for raw data generated by electronic means to be retained in an acceptable paper or pdf format, where it can be justified that a static record maintains the integrity of the original data. however, the data retention process should record all data, (including metadata) for all activities which directly or indirectly impact on all aspects of the quality of medicinal products, (e.g. 
for records of analysis this may include: raw data, metadata, relevant audit trail and result files, software / system configuration settings specific to each analytical run, and all data processing runs (including methods and audit trails) necessary for reconstruction of a given raw data set). it would also require a documented means to verify that the printed records were an accurate representation. this approach is likely to be onerous in its administration to enable a gmp/gdp compliant record.\n\n7.7.3 many electronic records are important to retain in their dynamic format, to enable interaction with the data. data should be retained in a dynamic form where this is critical to its integrity or later verification. risk management principles should be utilised to support and justify whether and how long data should be stored in a dynamic format.\n\n7.7.4 at the receiving site, these records (true copies) may either be managed in a paper or electronic format (e.g., pdf) and should be controlled according to an approved qa procedure.\n\n7.7.5 care should be taken to ensure that documents are appropriately authenticated as \"true copies\" in a manner that allows the authenticity of the document to be readily verified, e.g. through the use of handwritten or electronic signatures or generated following a validated process for creating true copies.\n\n|item|how should the \"true copy\" be issued and controlled?|\n|---|---|\n|1.|creating a \"true copy\" of a paper document. at the company who issues the true copy: - obtain the original of the document to be copied - photocopy the original document ensuring that no information from the original copy is lost; - verify the authenticity of the copied document and sign and date the new hardcopy as a \"true copy\"; the \"true copy\" may now be sent to the intended recipient. creating a \"true copy\" of an electronic document. 
a true copy of an electronic record should be created by electronic means (electronic file copy), including all required metadata. creating pdf versions of electronic data should be prohibited, where there is the potential for loss of metadata. the \"true copy\" may now be sent to the intended recipient. a distribution list of all issued \"true copies\" (soft/hard) should be maintained.|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "909ca3a0-dfb6-4b11-ae7b-7c4aac85abbf": {"__data__": {"id_": "909ca3a0-dfb6-4b11-ae7b-7c4aac85abbf", "embedding": null, "metadata": {"page_label": "19", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "\"Ensuring Data Integrity and Control of True Copies in Record Management: Best Practices and Guidelines\"", "questions_this_excerpt_can_answer": "1. What specific steps should be taken to ensure the integrity and authenticity of true copies generated from original documents in regulated environments, according to the PI 041-1 guidelines from 2021?\n\n2. How does the PI 041-1 document from 2021 address the responsibilities and controls necessary for the generation, transfer, and auditing of \"true copies\" in the context of data integrity between contract givers and receivers?\n\n3. 
What are the limitations of remote review of summary reports as outlined in the PI 041-1 guidelines from 2021, and how do these limitations impact the control of data integrity in the context of exchanging data between physically remote manufacturing sites and market authorization holders?", "prev_section_summary": "The section discusses guidelines and procedures for issuing and controlling true copies of documents in regulated environments. It covers the steps to ensure the integrity and authenticity of true copies of paper and electronic documents, including the retention of raw data, metadata, audit trails, and software configurations. The document emphasizes the importance of retaining electronic records in a dynamic format when necessary for integrity or verification, and suggests using risk management principles to determine retention periods. It also addresses the authentication of true copies through signatures or validated processes. Key topics include creating true copies, retaining data in dynamic formats, and controlling the distribution of true copies. Key entities mentioned are original documents, true copies, metadata, audit trails, and risk management principles.", "excerpt_keywords": "Data integrity, True copies, Record management, Remote review, Quality agreement"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n## specific elements that should be checked when reviewing records:\n\nverify the procedure for the generation of true copies, and ensure that the generation method is controlled appropriately.\ncheck that true copies issued are identical (complete and accurate) to original records.
copied records should be checked against the original document records to make sure there is no tampering of the scanned image.\ncheck that scanned or saved records are protected to ensure data integrity.\nafter scanning paper records and verifying creation of a true copy:\n- where true copies are generated for distribution purposes, e.g. to be sent to a client, the original documents from which the scanned images have been created should be retained for the respective retention periods by the record owner.\n- where true copies are generated to aid document retention, it may be possible to retain the copy in place of the original records from which the scanned images have been created.\n\n## at the company who receives the true copy:\n\n- the paper version, scanned copy or electronic file should be reviewed and filed according to good document management practices.\n\nthe document should clearly indicate that it is a true copy and not an original record.\n\n## specific elements that should be checked when reviewing records:\n\ncheck that received records are checked and retained appropriately.\na system should be in place to verify the authenticity of \"true copies\" e.g. through verification of the correct signatories.\n\n## 7.7.6 a quality agreement should be in place to address the responsibilities for the generation and transfer of \"true copies\" and data integrity controls.
the system for the issuance and control of \"true copies\" should be audited by the contract giver and receiver to ensure the process is robust and meets data integrity principles.\n\n## 7.8 limitations of remote review of summary reports\n\n## 7.8.1 the remote review of data within summary reports is a common necessity; however, the limitations of remote data review should be fully understood to enable adequate control of data integrity.\n\n## 7.8.2 summary reports of data are often supplied between physically remote manufacturing sites, market authorisation holders and other interested parties. however, it should be acknowledged that summary reports are essentially limited in their nature, in that critical supporting data and metadata is often not included and therefore original data cannot be reviewed.\n\npi 041-1 19 of 63 1 july 2021", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "34874db2-8d82-48a4-a1f4-ca790b227f8f": {"__data__": {"id_": "34874db2-8d82-48a4-a1f4-ca790b227f8f", "embedding": null, "metadata": {"page_label": "20", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Ensuring Data Integrity and Documentation Control in Pharmaceutical Quality Systems", "questions_this_excerpt_can_answer": "1. 
What specific steps should be taken prior to accepting summary data to ensure compliance with data integrity principles in regulated environments, according to the document \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\"?\n\n2. How does the document \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\" recommend verifying the authenticity and accuracy of summary data prepared by external entities or different sites within the same organization?\n\n3. What are the key elements outlined in the document \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\" for maintaining data integrity in paper-based systems within pharmaceutical quality systems?", "prev_section_summary": "This section focuses on ensuring data integrity and control of true copies in record management in regulated environments. It outlines specific steps to be taken when reviewing records, such as verifying the procedure for generating true copies, ensuring the authenticity of copied records, and protecting scanned or saved records for data integrity. It also emphasizes the importance of retaining original documents when generating true copies for distribution or document retention purposes. 
Additionally, it highlights the need for a quality agreement between contract givers and receivers regarding the generation and transfer of true copies, as well as the limitations of remote review of summary reports in maintaining data integrity.", "excerpt_keywords": "Data integrity, Documentation control, Pharmaceutical quality systems, Summary data, Paper-based systems"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n## 7.8.3\n\nit is therefore essential that summary reports are viewed as but one element of the process for the transfer of data and that interested parties and inspectorates do not place sole reliance on summary report data.\n\n## 7.8.4\n\nprior to acceptance of summary data, an evaluation of the suppliers quality system and compliance with data integrity principles should be established. it is not normally acceptable nor possible to determine compliance with data integrity principles through the use of a desk-top or similar assessment.\n\n## 7.8.4.1\n\nfor external entities, this should be determined through on-site audit when considered important in the context of quality risk management. the audit should assure the veracity of data generated by the company, and include a review of the mechanisms used to generate and distribute summary data and reports.\n\n## 7.8.4.2\n\nwhere summary data is distributed between different sites of the same organisation, the evaluation of the supplying sites compliance may be determined through alternative means (e.g. evidence of compliance with corporate procedures, internal audit reports, etc.).\n\n## 7.8.5\n\nsummary data should be prepared in accordance with agreed procedures and reviewed and approved by authorised staff at the original site. 
summaries should be accompanied by a declaration signed by the authorised person stating the authenticity and accuracy of the summary. the arrangements for the generation, transfer and verification of summary reports should be addressed within quality/technical agreements.\n\n## 8\n\nspecific data integrity considerations for paper-based systems\n\n## 8.1\n\nstructure of pharmaceutical quality system and control of blank forms/templates/records\n\n## 8.1.1\n\nthe effective management of paper based documents is a key element of gmp/gdp. accordingly, the documentation system should be designed to meet gmp/gdp requirements and ensure that documents and records are effectively controlled to maintain their integrity.\n\n## 8.1.2\n\npaper records should be controlled and should remain attributable, legible, contemporaneous, original and accurate, complete, consistent, enduring (indelible/durable), and available (alcoa+) throughout the data lifecycle.\n\n## 8.1.3\n\nprocedures outlining good documentation practices and arrangements for document control should be available within the pharmaceutical quality system.
these procedures should specify how data integrity is maintained throughout the lifecycle of the data, including:\n\n- creation, review, and approval of master documents and procedures;\n- generation, distribution and control of templates used to record data (master, logs, etc.);\n- retrieval and disaster recovery processes regarding records;", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "e4fcd593-2fa9-4620-b99d-6e5d20f21479": {"__data__": {"id_": "e4fcd593-2fa9-4620-b99d-6e5d20f21479", "embedding": null, "metadata": {"page_label": "21", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Document Control and Record Management Guidelines", "questions_this_excerpt_can_answer": "1. What specific measures are recommended to ensure the traceability and controlled issuance of working copies of documents, such as SOPs and blank forms, in regulated environments according to the PI 041-1 guidelines from 2021?\n \n2. How does the document suggest handling the completion of paper-based documents to ensure accuracy, authenticity, and completeness, particularly in terms of identifying individual operators and data entry formats?\n\n3. 
What are the key expectations outlined for the generation, distribution, and control of records to minimize the risk of inappropriate use or falsification, as per the quality risk management approach recommended in the PI 041-1 guidelines?", "prev_section_summary": "The section discusses the importance of ensuring data integrity and documentation control in pharmaceutical quality systems. It emphasizes the need to verify the authenticity and accuracy of summary data prepared by external entities or different sites within the same organization. Key topics include the evaluation of suppliers' quality systems, compliance with data integrity principles, on-site audits, review of mechanisms for generating and distributing summary data, and the preparation, review, and approval of summary reports. The section also outlines specific data integrity considerations for paper-based systems, such as the structure of the pharmaceutical quality system, control of blank forms/templates/records, and procedures for maintaining data integrity throughout the data lifecycle.", "excerpt_keywords": "Document Control, Data Management, Record Management, Data Integrity, Regulated Environments"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n## generation of working copies of documents\n\ngeneration of working copies of documents for routine use, with specific emphasis on ensuring copies of documents, e.g. 
sops and blank forms are issued and reconciled for use in a controlled and traceable manner.\n\n## completion of paper-based documents\n\ncompletion of paper-based documents, specifying how individual operators are identified, data entry formats, recording amendments, and routine review for accuracy, authenticity, and completeness.\n\n## filing, retrieval, retention, archival, and disposal of records\n\nhandling of filing, retrieval, retention, archival, and disposal of records.\n\n## importance of controlling records\n\n### records are critical to gmp/gdp operations and thus control is necessary to ensure:\n\n- evidence of activities performed\n- evidence of compliance with gmp/gdp requirements and company policies, procedures, and work instructions\n- effectiveness of pharmaceutical quality system\n- traceability\n- process authenticity and consistency\n- evidence of the good quality attributes of the medicinal products manufactured\n- in case of complaints or recalls, records could be used for investigational purposes\n- in case of deviations or test failures, records are critical to completing an effective investigation\n\n## generation, distribution, and control of template records\n\nmanaging and controlling master documents is necessary to ensure that the risk of someone inappropriately using and/or falsifying a record by ordinary means (i.e. not requiring the use of specialist fraud skills) is reduced to an acceptable level. the following expectations should be implemented using a quality risk management approach, considering the risk and criticality of data recorded (see section 5.4, 5.5).\n\n## expectations for the generation, distribution, and control of records\n\n|item|generation|\n|---|---|\n|1.|expectation|\n| |all documents should have a unique identifier (including the version number) and should be checked, approved, signed, and dated. the use of uncontrolled documents should be prohibited by local procedures. 
the use of temporary recording practices, e.g. scraps of paper should be prohibited.|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "775601fe-6fe8-4f0a-a662-6f8f3cd2d4b4": {"__data__": {"id_": "775601fe-6fe8-4f0a-a662-6f8f3cd2d4b4", "embedding": null, "metadata": {"page_label": "22", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "\"Identifying and Mitigating Risks of Inadequate Document Design for Data Entry\"", "questions_this_excerpt_can_answer": "1. How does inadequate document design contribute to the risk of data falsification and omission in regulated environments, according to the \"Good Practices for Data Management and Integrity in Regulated Environments (2021)\" document?\n\n2. What specific design features does the \"Good Practices for Data Management and Integrity in Regulated Environments (2021)\" document recommend to ensure manual data entries are clear, legible, and complete, particularly in the context of correcting transcription errors?\n\n3. 
How does the \"Good Practices for Data Management and Integrity in Regulated Environments (2021)\" document suggest mitigating the risk of using superseded forms and ensuring that critical data is recorded in the correct order, reflecting the operational process and related SOPs?", "prev_section_summary": "The section discusses the importance of document control and record management in regulated environments, focusing on the generation of working copies of documents, completion of paper-based documents, and handling of records. It emphasizes the need for traceability, accuracy, authenticity, and completeness in documentation, as well as the control of records to ensure compliance with GMP/GDP requirements and company policies. The section also highlights the expectations for the generation, distribution, and control of records, including the use of unique identifiers, approval processes, and prohibition of uncontrolled documents or temporary recording practices.", "excerpt_keywords": "Document Design, Data Entry, Data Integrity, Regulated Environments, Good Practices"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n## potential risk of not meeting expectations/items to be checked\n\nuncontrolled documents increase the potential for omission or loss of critical data as these documents may be discarded or destroyed without traceability.
in addition, uncontrolled records may not be designed to correctly record critical data.\nit might be easier to falsify uncontrolled records.\nuse of temporary recording practices may lead to data omission, and these temporary original records are not specified for retention.\nif records can be created and accessed without control, it is possible that the records may not have been recorded at the time the event occurred.\nthere is a risk of using superseded forms if there is no version control or controls for issuance.\n\n## expectation\n\nthe document design should provide sufficient space for manual data entries.\n\n## potential risk of not meeting expectations/items to be checked\n\nhandwritten data may not be clear and legible if the spaces provided for data entry are not sufficiently sized.\ndocuments should be designed to provide sufficient space for comments, e.g. in case of a transcription error, there should be sufficient space for the operator to cross out, initial and date the error, and record any explanation required.\nif additional pages of the documents are added to allow complete documentation, the number of, and reference to any pages added should be clearly documented on the main record page and signed.\nsufficient space should be provided in the document format to add all necessary data, and data should not be recorded haphazardly on the document, for example to avoid recording on the reverse of printed pages which are not intended for this purpose.\n\n## expectation\n\nthe document design should make it clear what data is to be provided in entries.\n\n## potential risks of not meeting expectations/items to be checked\n\nambiguous instructions may lead to inconsistent/incorrect recording of data.\ngood design ensures all critical data is recorded and ensures clear, contemporaneous and enduring (indelible/durable) completion of entries.\nthe document should also be structured in such a way as to record information in the same order
as the operational process and related sop, to minimize the risk of inadvertently omitting critical data.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "ae2290dc-dd59-4404-8166-d139f17b1515": {"__data__": {"id_": "ae2290dc-dd59-4404-8166-d139f17b1515", "embedding": null, "metadata": {"page_label": "23", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Document Control and Distribution Policy and Procedures", "questions_this_excerpt_can_answer": "1. What specific measures are recommended to distinguish master documents from copies in a regulated environment, according to the \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\" document?\n \n2. How does the \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\" document suggest controlling access to master templates stored electronically to prevent unauthorized changes?\n\n3. What procedures does the \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\" document recommend for the issuance of documents to ensure control and prevent the use of obsolete versions?", "prev_section_summary": "The section discusses the risks associated with inadequate document design for data entry in regulated environments, as outlined in the \"Good Practices for Data Management and Integrity in Regulated Environments (2021)\" document.
It highlights the potential risks of uncontrolled documents, unclear and illegible handwritten data, and ambiguous instructions leading to inconsistent or incorrect data recording. The document recommends specific design features to ensure manual data entries are clear, legible, complete, and recorded in the correct order to mitigate the risk of data falsification, omission, and using superseded forms. It emphasizes the importance of providing sufficient space for data entries, clear instructions on what data to provide, and structuring the document to reflect the operational process and related SOPs.", "excerpt_keywords": "Document Control, Data Management, Integrity, Regulated Environments, Distribution Policy"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n#### expectation\n\ndocuments should be stored in a manner which ensures appropriate version control.\n\nmaster documents should contain distinctive marking so as to distinguish the master from a copy, e.g.
use of coloured papers or inks so as to prevent inadvertent use.\n\nmaster documents (in electronic form) should be prevented from unauthorised or inadvertent changes.\n\ne.g.: for the template records stored electronically, the following precautions should be in place:\n\n- access to master templates should be controlled;\n- process controls for creating and updating versions should be clear and practically applied/verified; and\n- master documents should be stored in a manner which prevents unauthorised changes.\n\npotential risk of not meeting expectations/items to be checked\n\n- inappropriate storage conditions can allow unauthorised modification, use of expired and/or draft documents or cause the loss of master documents.\n- the processes of implementation and the effective communication, by way of appropriate training prior to implementation when applicable, are just as important as the document.\n\n#### item distribution and control\n\n##### expectations\n\nupdated versions should be distributed in a timely manner.\n\nobsolete master documents and files should be archived and their access restricted.\n\nany issued and unused physical documents should be retrieved and reconciled.\n\nwhere authorised by quality, recovered copies of documents may be destroyed. 
however, master copies of authorised documents should be preserved.\n\npotential risk of not meeting expectations/items to be checked\n\n- there may be a risk that obsolete versions can be used by mistake if available for use.\n\n##### expectation\n\ndocument issuance should be controlled by written procedures that include the following controls:\n\n- details of who issued the copies and when they were issued;", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "ea2a6322-9fb9-4522-b8e9-4bae3b25f1e5": {"__data__": {"id_": "ea2a6322-9fb9-4522-b8e9-4bae3b25f1e5", "embedding": null, "metadata": {"page_label": "24", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Document Control and Record Management in Pharmaceutical Quality Systems: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. What specific methods are recommended for differentiating approved copies of documents in a regulated pharmaceutical environment to prevent unauthorized use or falsification?\n\n2. How does the document suggest managing the issuance and control of blank templates, particularly in the context of critical Good Manufacturing Practice (GMP) or Good Distribution Practice (GDP) forms, to ensure the accuracy and completeness of records?\n\n3. 
What are the guidelines for maintaining an index of authorized master documents within a pharmaceutical quality system, and what key information should this index include for each type of template record?", "prev_section_summary": "This section discusses the importance of document control and distribution policies and procedures in regulated environments to ensure data management and integrity. It emphasizes the need for appropriate version control, distinguishing master documents from copies, preventing unauthorized changes to electronic master templates, and controlling access to master documents. The section also highlights the risks of not meeting these expectations, such as unauthorized modifications, use of expired documents, and the importance of clear procedures for document issuance to prevent the use of obsolete versions. Key topics include storage of documents, access control, distribution of updated versions, archiving obsolete documents, and controls for document issuance. Key entities mentioned are master documents, copies, electronic templates, and issued physical documents.", "excerpt_keywords": "Document Control, Record Management, Pharmaceutical Quality Systems, Data Integrity, Regulated Environments"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n### 8.4 document control\n\n- clear means of differentiating approved copies of documents, e.g. 
by use of a secure stamp, or paper colour code not available in the working areas or another appropriate system;\n\n- ensuring that only the current approved version is available for use;\n\n- allocating a unique identifier to each blank document issued and recording the issue of each document in a register;\n\n- numbering every distributed copy (e.g.: copy 2 of 2) and sequential numbering of issued pages in bound books;\n\n- where the re-issue of additional copies of the blank template is necessary, a controlled process regarding re-issue should be followed with all distributed copies maintained and a justification and approval for the need of an extra copy recorded, e.g.: \"the original template record was damaged\";\n\n- critical gmp/gdp blank forms (e.g.: worksheets, laboratory notebooks, batch records, control records) should be reconciled following use to ensure the accuracy and completeness of records; and\n\n- where copies of documents other than records, (e.g. procedures), are printed for reference only, reconciliation may not be required, providing the documents are time-stamped on generation, and their short-term validity marked on the document.\n\npotential risk of not meeting expectations/items to be checked\n\n- without the use of security measures, there is a risk that rewriting or falsification of data may be made after photocopying or scanning the template record (which gives the user another template copy to use).\n- obsolete versions can be used intentionally or by error.\n- a filled record with an anomalous data entry could be replaced by a new rewritten template.\n- all unused forms should be accounted for, and either defaced and destroyed, or returned for secure filing.\n- check that (where used) reference copies of documents are clearly marked with the date of generation, period of validity and clear indication that they are for reference only and not an official copy, e.g. 
marked uncontrolled when printed.\n\n### 8.4.1 index of authorized master documents\n\nan index of all authorized master documents (sops, forms, templates and records) should be maintained within the pharmaceutical quality system. this index should mention for each type of template record at least the following information: title, identifier including version number, location (e.g. documentation database), effective date, next review date, etc.\n\n### 8.5 use and control of records located at the point-of-use\n\nrecords should be available to operators at the point-of-use and appropriate controls should be in place to manage these records. these controls should be carried out to minimize the risk of damage or loss of the records and ensure data integrity. where necessary, measures should be taken to protect records from being soiled (e.g. getting wet or stained by materials, etc.).", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "81d5f905-e73f-4c1b-b992-8b522f1543a4": {"__data__": {"id_": "81d5f905-e73f-4c1b-b992-8b522f1543a4", "embedding": null, "metadata": {"page_label": "25", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Managing Records with Scribes: Control and Completion", "questions_this_excerpt_can_answer": "1. 
What specific guidelines does the document \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\" provide for the use of scribes in documenting records in regulated environments?\n \n2. How does the document address the handling of unused, blank fields within documents to ensure data integrity, according to the excerpt from \"Managing Records with Scribes: Control and Completion\"?\n\n3. What are the conditions under which the document permits the use of a second person, or scribe, to record activities on behalf of another operator in regulated environments, and what procedures must be followed in such cases?", "prev_section_summary": "This section focuses on document control and record management in pharmaceutical quality systems. Key topics include methods for differentiating approved copies of documents, managing the issuance and control of blank templates, reconciling critical Good Manufacturing Practice (GMP) or Good Distribution Practice (GDP) forms, and maintaining an index of authorized master documents. The section emphasizes the importance of security measures to prevent unauthorized use or falsification of documents, as well as the need for controls to ensure the accuracy and completeness of records. 
Additionally, it highlights the importance of maintaining an index of authorized master documents with key information for each type of template record.", "excerpt_keywords": "Keywords: Data Management, Integrity, Regulated Environments, Scribes, Record Control"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n## 8.5.2 records\n\nrecords should be appropriately controlled in these areas by designated persons or processes in accordance with written procedures.\n\n## 8.6 filling out records\n\n### 8.6.1 completion of records\n\n|item|completion of records|\n|---|---|\n|1.|- handwritten entries should be made by the person who executed the task.\n- unused, blank fields within documents should be voided (e.g. crossed-out), dated and signed.\n- handwritten entries should be made in clear and legible writing.\n- the completion of date fields should be done in an unambiguous format defined for the site. e.g. dd/mm/yyyy or mm/dd/yyyy.\n- potential risk of not meeting expectations/items to be checked:\n|\n|2.|records relating to operations should be completed contemporaneously.|\n\nscribes may only be used in exceptional circumstances, refer footnote 8.\n\nthe use of scribes (second person) to record activity on behalf of another operator should be considered exceptional, and only take place where:\n\n- the act of recording places the product or activity at risk e.g. documenting line interventions by sterile operators.\n- to accommodate cultural or staff literacy / language limitations, for instance where an activity is performed by an operator, but witnessed and recorded by a scribe. 
in these cases, bilingual or controlled translations of documents into local languages and dialect are advised.\n\nin both situations, the scribe recording should be contemporaneous with the task being performed, and should identify both the person performing the observed task and the person completing the record. the person performing the observed task should countersign the record wherever possible, although it is accepted that this countersigning step will be retrospective. the process for a scribe to complete documentation should be described in an approved procedure, which should specify the activities to which the process applies and assesses the risks associated.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "c09f7dd1-d8b2-48d6-b06b-a9dae0ca0327": {"__data__": {"id_": "c09f7dd1-d8b2-48d6-b06b-a9dae0ca0327", "embedding": null, "metadata": {"page_label": "26", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "\"Guidelines for Creating and Maintaining Enduring and Attributable Records in Compliance with Expectations\"", "questions_this_excerpt_can_answer": "1. What specific guidelines does the document provide for ensuring the indelibility of records in regulated environments, particularly regarding the type of writing instruments and materials to be used?\n \n2. 
How does the document address the authentication of records through signatures and unique identifiers, including the management and verification of signature and initials logs?\n\n3. What procedure does the document recommend for making corrections to records in a way that maintains full traceability and integrity of the data?", "prev_section_summary": "The section discusses the management of records with scribes in regulated environments, focusing on the completion of records, the use of scribes in exceptional circumstances, and the procedures to be followed when a scribe is used to record activities on behalf of another operator. Key topics include the control of records, completion of records, contemporaneous completion of records, and the conditions under which scribes may be used. Entities mentioned include designated persons or processes for record control, the person executing the task, scribes, operators, and the process for scribes to complete documentation.", "excerpt_keywords": "Keywords: Data management, Integrity, Regulated environments, Records, Guidelines"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n### potential risk of not meeting expectations/items to be checked\n\nverify that records are available within the immediate areas in which they are used, i.e. inspectors should expect that sequential recording can be performed at the site of operations. 
if the form is not available at the point of use, this will not allow operators to fill in records at the time of occurrence.\n\n### expectation\n\nrecords should be enduring (indelible).\n\n### potential risk of not meeting expectations/items to be checked\n\n- check that written entries are in ink, which is not erasable, and/or will not smudge or fade (during the retention period).\n- check that the records were not filled out using pencil prior to use of pen (overwriting).\n- note that some paper printouts from systems may fade over time, e.g. thermal paper. indelible signed and dated true copies of these should be produced and kept.\n\n### expectation\n\nrecords should be signed and dated using a unique identifier that is attributable to the author.\n\n### potential risk of not meeting expectations/items to be checked\n\n- check that there are signature and initials logs, that are controlled and current and that demonstrate the use of unique examples, not just standardized printed letters.\n- ensure that all key entries are signed & dated, particularly if steps occur over time, i.e. not just signed at the end of the page and/or process.\n- the use of personal seals is generally not encouraged; however, where used, seals should be controlled for access. there should be a log which clearly shows traceability between an individual and their personal seal. 
use of personal seals should be dated (by the owner), to be deemed acceptable.\n\n### making corrections on records\n\ncorrections to the records should be made in such way that full traceability is maintained.\n\n|item|how should records be corrected?|\n|---|---|\n|1|cross out what is to be changed with a single line.|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "c78ecd61-fff0-4d59-a17e-5386fd875cf1": {"__data__": {"id_": "c78ecd61-fff0-4d59-a17e-5386fd875cf1", "embedding": null, "metadata": {"page_label": "27", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Verification of Records in Critical Process Steps", "questions_this_excerpt_can_answer": "1. What specific steps are outlined for the verification of records related to critical process steps in batch production, according to the document \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\"?\n\n2. How does the document \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\" recommend handling the review and approval process for laboratory records associated with testing steps to ensure data integrity?\n\n3. 
What additional controls does the \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\" suggest considering when critical test interpretations are made by a single individual, especially in the context of recording microbial colonies on agar?", "prev_section_summary": "The section provides guidelines for ensuring the indelibility of records in regulated environments, including the use of specific writing instruments and materials. It also addresses the authentication of records through signatures and unique identifiers, as well as the procedure for making corrections to records while maintaining traceability and integrity of the data. Key topics include the enduring nature of records, the use of ink for entries, the importance of unique identifiers for signatures, and the correct method for making corrections to records. Key entities mentioned include inspectors, operators, authors of records, and individuals using personal seals.", "excerpt_keywords": "Verification, Records, Critical Process Steps, Data Integrity, Batch Production"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n## verification of records (secondary checks)\n\n|item|when and who should verify the records?|\n|---|---|\n|1.|expectation records of critical process steps, e.g. critical steps within batch records, should be: - reviewed/witnessed by independent and designated personnel at the time of operations occurring; and - reviewed by an approved person within the production department before sending them to the quality unit; and - reviewed and approved by the quality unit (e.g. authorised person / qualified person) before release or distribution of the batch produced. batch production records of non-critical process steps is generally reviewed by production personnel according to an approved procedure. 
laboratory records for testing steps should also be reviewed by designated personnel (e.g.: second analysts) following completion of testing. reviewers are expected to check all entries, critical calculations, and undertake appropriate assessment of the reliability of test results in accordance with data-integrity principles. additional controls should be considered when critical test interpretations are made by a single individual (e.g. recording of microbial colonies on agar|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "0431a216-8edb-4a40-b139-248dc972b3eb": {"__data__": {"id_": "0431a216-8edb-4a40-b139-248dc972b3eb", "embedding": null, "metadata": {"page_label": "28", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Quality Assurance Procedures for Production Records and Data Review", "questions_this_excerpt_can_answer": "1. What specific steps should be taken to ensure the integrity of production records and data during the review process according to the PI 041-1 guidelines?\n \n2. How does PI 041-1 recommend verifying critical data in regulated environments, and what alternatives does it suggest for traditional verification methods?\n\n3. 
According to the document PI 041-1, what criteria should be used to determine the need for and extent of secondary checks in the review process of manual data within regulated environments?", "prev_section_summary": "The section discusses the verification of records in critical process steps, outlining specific steps for ensuring data integrity in batch production and laboratory testing. Key topics include the review and approval process for records, involvement of independent personnel, designated reviewers, and controls for critical test interpretations. Entities mentioned include production personnel, quality unit, designated personnel for reviewing laboratory records, and the importance of data-integrity principles in verifying records.", "excerpt_keywords": "Data Management, Data Integrity, Regulated Environments, Quality Assurance, Production Records"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\nplates). a secondary review may be required in accordance with risk management principles. in some cases this review may need to be performed in real-time. suitable electronic means of verifying critical data may be an acceptable alternative, e.g. taking photograph images of the data for retention. this verification should be conducted after performing production-related tasks and activities and be signed or initialled and dated by the appropriate persons. local sops should be in place to describe the process for review of written documents. 
specific elements that should be checked when reviewing records:\n\n- verify the process for the handling of production records within processing areas to ensure they are readily available to the correct personnel at the time of performing the activity to which the record relates.\n- verify that any secondary checks performed during processing were performed by appropriately qualified and independent personnel, e.g. production supervisor or qa.\n- check that documents were reviewed by production personnel and then quality assurance personnel following completion of operational activities.\n\n|item|how should records be verified?|\n|---|---|\n|2.|expectation check that all the fields have been completed correctly using the current (approved) templates, and that the data was critically compared to the acceptance criteria. check items 1, 2, 3, and 4 of section 8.6 and items 1 and 2 of section 8.7 specific elements that should be checked when reviewing records:|\n| |inspectors should review company procedures for the review of manual data to determine the adequacy of processes. the need for, and extent of a secondary check should be based on quality risk management principles, based on the criticality of the data generated. check that the secondary reviews of data include a verification of any calculations used. 
view original data (where possible) to confirm that the correct data was transcribed for the calculation.|\n\npi 041-1 28 of 63 1 july 2021", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "d51f7448-e987-496e-84c6-7e191f4183a1": {"__data__": {"id_": "d51f7448-e987-496e-84c6-7e191f4183a1", "embedding": null, "metadata": {"page_label": "29", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Electronic Systems and Document Retention in GMP/GDP Compliance: Best Practices and Guidelines", "questions_this_excerpt_can_answer": "1. What specific steps and considerations are recommended for ensuring the integrity and traceability of directly-printed paper records from simple electronic systems like balances and pH meters in regulated environments, according to the PI 041-1 guidelines from 2021?\n\n2. According to the 2021 PI 041-1 guidelines, what are the key elements that should be included in a system for archiving records in compliance with GMP/GDP requirements, and how should the effectiveness and traceability of this system be evaluated?\n\n3. 
How does the PI 041-1 document from 2021 address the retention and archiving of records in relation to meeting both GMP/GDP requirements and additional local or national legislation, and what considerations should be made for using outside storage services for these purposes?", "prev_section_summary": "The section discusses the importance of ensuring the integrity of production records and data during the review process in regulated environments according to the PI 041-1 guidelines. It highlights the need for secondary reviews based on risk management principles and suggests using electronic means for verifying critical data as an alternative to traditional methods. The section also outlines specific criteria for determining the need for and extent of secondary checks in the review process of manual data, emphasizing the verification of completed fields, comparison to acceptance criteria, and confirmation of correct data transcription for calculations. Additionally, it mentions the involvement of production and quality assurance personnel in the review process and the importance of following local SOPs for document review procedures.", "excerpt_keywords": "Electronic Systems, Document Retention, GMP, GDP, Data Integrity"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n## 8.9 direct print-outs from electronic systems\n\n8.9.1 some very simple electronic systems, e.g. balances, ph meters or simple processing equipment which do not store data, generate directly-printed paper records. these types of systems and records provide limited opportunity to influence the presentation of data by (re-)processing, changing of electronic date/time stamps. 
in these circumstances, the original record should be signed and dated by the person generating the record and information to ensure traceability, such as sample id, batch number, etc. should be recorded on the record. these original records should be attached to batch processing or testing records.\n\n8.9.2 consideration should be given to ensuring these records are enduring (see section 8.6.1).\n\n## 8.10 document retention (identifying record retention requirements and archiving records)\n\n8.10.1 the retention period of each type of records should (at a minimum) meet those periods specified by gmp/gdp requirements. consideration should be given to other local or national legislation that may stipulate longer storage periods.\n\n8.10.2 the records can be retained internally or by using an outside storage service subject to quality agreements. in this case, the data centres locations should be identified. a risk assessment should be available to demonstrate retention systems/facilities/services are suitable and that the residual risks are understood.\n\n|item|where and how should records be archived?|\n|---|---|\n|1.|expectation a system should be in place describing the different steps for archiving records (identification of archive boxes, list of records by box, retention period, archiving location, etc.). instructions regarding the controls for storage, as well as access and recovery of records should be in place. systems should ensure that all gmp/gdp relevant records are stored for periods that meet gmp/gdp requirements. 
specific elements that should be checked when reviewing records: - check that the system implemented for retrieving archived records is effective and traceable.\n- check if the records are stored in an orderly manner and are easily identifiable.\n- check that records are in the defined location and appropriately secured.\n|\n\nnote that storage periods for some documents may be dictated by other local or national legislation.\n\npi 041-1 29 of 63 1 july 2021", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "041b4372-f64e-4523-9a0f-6bbf6765521f": {"__data__": {"id_": "041b4372-f64e-4523-9a0f-6bbf6765521f", "embedding": null, "metadata": {"page_label": "30", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Record Management and Protection Policy and Disposal Process", "questions_this_excerpt_can_answer": "1. What specific measures should be taken to ensure the durability and retrievability of hardcopy quality records in regulated environments, according to the PI 041-1 Good Practices for Data Management and Integrity document from 2021?\n\n2. How does the PI 041-1 document from 2021 recommend protecting records from potential damage or destruction, and what are the specific elements that should be checked when reviewing records for their protection in regulated environments?\n\n3. 
What guidelines does the PI 041-1 document from 2021 provide for the disposal of original records or true copies in regulated environments to prevent accidental destruction of current records or the inadvertent reintroduction of historical records into the current record stream?", "prev_section_summary": "This section discusses the practices for ensuring data integrity and traceability in regulated environments, specifically focusing on direct print-outs from simple electronic systems like balances and pH meters. It emphasizes the importance of signing and dating original records, including relevant information for traceability, and attaching them to batch processing or testing records. The section also addresses document retention requirements, highlighting the need to meet GMP/GDP requirements and consider additional local or national legislation for storage periods. It suggests archiving records internally or using outside storage services, with a system in place for identifying archive boxes, retention periods, and controls for storage, access, and recovery of records. 
The effectiveness and traceability of the archiving system should be evaluated, ensuring compliance with GMP/GDP requirements.", "excerpt_keywords": "Record Management, Data Integrity, Regulated Environments, Disposal Process, Document Protection"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n## expectations\n\nall hardcopy quality records should be archived in secure locations to prevent damage or loss, in such a manner that it is easily traceable and retrievable, and in a manner that ensures that records are durable for their archived life.\n\nspecific elements that should be checked when reviewing records:\n\n- check for the outsourced archived operations if there is a quality agreement in place and if the storage location was audited.\n- ensure there is some assessment of ensuring that documents will still be legible/available for the entire archival period.\n- in case of printouts which are not permanent (e.g. thermal transfer paper), a verified (true) copy should be retained.\n- verify whether the storage methods used permit efficient retrieval of documents when required.\n\n## expectations\n\nall records should be protected from damage or destruction by:\n\n- fire;\n- liquids (e.g. water, solvents, and buffer solution);\n- rodents;\n- humidity etc; and\n- unauthorized personnel access, who may attempt to amend, destroy, or replace records.\n\nspecific elements that should be checked when reviewing records:\n\ncheck if there are systems in place to protect records (e.g. pest control and sprinklers).\nnote: sprinkler systems should be implemented according to local safety requirements; however, they should be designed to prevent damage to documents, e.g. 
documents are protected from water.\ncheck for appropriate access controls for records.\n\n## disposal of original records or true copies\n\na documented process for the disposal of records should be in place to ensure that the correct original records or true copies are disposed of after the defined retention period. the system should ensure that current records are not destroyed by accident and that historical records do not inadvertently make their way back into the current record stream (e.g. historical records confused/mixed with existing records.)", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "f525b0ee-7da5-4291-9844-d470254b78c7": {"__data__": {"id_": "f525b0ee-7da5-4291-9844-d470254b78c7", "embedding": null, "metadata": {"page_label": "31", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "\"Ensuring Data Integrity in Pharmaceutical Quality Systems through Effective Management of Computerised Systems\"", "questions_this_excerpt_can_answer": "1. What specific measures are recommended by the PI 041-1 document to ensure the integrity of data within pharmaceutical quality systems, particularly in relation to the management of computerised systems?\n\n2. How does the PI 041-1 document suggest regulated entities should approach the archiving or destruction of retired records to maintain data integrity within regulated environments?\n\n3. 
According to the PI 041-1 document, what considerations should be made when designing, evaluating, and selecting computerised systems in the pharmaceutical industry to ensure they meet GMP and GDP requirements, including data integrity aspects?", "prev_section_summary": "The section discusses the expectations and specific measures for managing and protecting hardcopy quality records in regulated environments according to the PI 041-1 document from 2021. Key topics include archiving records securely, ensuring durability and retrievability, protecting records from damage or destruction, and the disposal process for original records or true copies. Entities mentioned include outsourced archived operations, quality agreements, storage methods, protection systems against fire, liquids, rodents, humidity, and unauthorized access, as well as access controls and disposal processes to prevent accidental destruction or reintroduction of historical records.", "excerpt_keywords": "Data Integrity, Pharmaceutical Quality Systems, Computerised Systems, GMP, GDP"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n## 8.11.2\n\na record/register should be available to demonstrate appropriate and timely archiving or destruction of retired records in accordance with local policies.\n\n## 8.11.3\n\nmeasures should be in place to reduce the risk of deleting the wrong documents. the access rights allowing disposal of records should be controlled and limited to few persons.\n\n## 9 specific data integrity considerations for computerised systems\n\n### 9.1 structure of the pharmaceutical quality system and control of computerised systems\n\n#### 9.1.1\n\na large variety of computerised systems are used by companies to assist in a significant number of operational activities. 
these range from the simple standalone to large integrated and complex systems, many of which have an impact on the quality of products manufactured. it is the responsibility of each regulated entity to fully evaluate and control all computerised systems and manage them in accordance with gmp 10 and gdp 11 requirements.\n\n#### 9.1.2\n\norganisations should be fully aware of the nature and extent of computerised systems utilised, and assessments should be in place that describe each system, its intended use and function, and any data integrity risks or vulnerabilities that may be susceptible to manipulation. particular emphasis should be placed on determining the criticality of computerised systems and any associated data, in respect of product quality.\n\n#### 9.1.3\n\nall computerised systems with potential for impact on product quality should be effectively managed under a pharmaceutical quality system which is designed to ensure that systems are protected from acts of accidental or deliberate manipulation, modification or any other activity that may impact on data quality and integrity.\n\n#### 9.1.4\n\nthe processes for the design, evaluation, and selection of computerised systems should include appropriate consideration of the data management and integrity aspects of the system. regulated users should ensure that vendors of systems have an adequate understanding of gmp/gdp and data integrity requirements, and that new systems include appropriate controls to ensure effective data management. legacy systems are expected to meet the same basic requirements; however, full compliance may necessitate the use of additional controls, e.g. 
supporting administrative procedures or supplementary security hardware/software.\n\n#### 9.1.5\n\nregulated users should fully understand the extent and nature of data generated by computerised systems, and a risk-based approach should be taken to determining the data risk and criticality of data (including metadata) and the subsequent controls required to manage the data generated. for example:\n\n|10 pic/s pe 009|guide to good manufacturing practice for medicinal products, specifically part i chapters 4, part ii chapters 5, & annex 11|\n|---|---|\n|11 pic/s pe 011|gdp guide to good distribution practice for medicinal products, specifically section 3.5|\n\npi 041-1 31 of 63 1 july 2021", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "4cfedc43-b292-4e46-b2ad-caf13162bafe": {"__data__": {"id_": "4cfedc43-b292-4e46-b2ad-caf13162bafe", "embedding": null, "metadata": {"page_label": "32", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Data Integrity and Governance in Computerized Systems: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. What specific types of metadata are considered critical for the reconstruction of events in regulated environments, and how should they be managed according to the guidelines provided in the document \"Data Integrity and Governance in Computerized Systems: A Comprehensive Guide\"?\n\n2. 
How does the document suggest handling the vulnerabilities and risks associated with older electronic systems that may not have up-to-date security measures, especially in the context of ensuring data integrity and governance in computerized systems?\n\n3. What are the recommended practices for the qualification and validation of computerized systems to ensure good data governance practices, as outlined in the \"Data Integrity and Governance in Computerized Systems: A Comprehensive Guide\"?", "prev_section_summary": "This section discusses the importance of ensuring data integrity in pharmaceutical quality systems through effective management of computerized systems. It emphasizes the need for regulated entities to have measures in place for appropriate archiving or destruction of retired records, as well as controls to reduce the risk of deleting the wrong documents. The section also highlights specific data integrity considerations for computerized systems, including the structure of the pharmaceutical quality system, evaluation and selection of computerized systems, and understanding the nature and extent of data generated by these systems. It stresses the importance of managing computerized systems to protect them from manipulation or modification that could impact data quality and integrity. 
Regulatory guidelines such as GMP and GDP are referenced throughout the section to ensure compliance with industry standards.", "excerpt_keywords": "Data Integrity, Governance, Computerized Systems, Regulated Environments, Good Practices"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n## 9.1.5.1\n\nin dealing with raw data, the complete capture and retention of raw data would normally be required in order to reconstruct the manufacturing event or analysis.\n\n## 9.1.5.2\n\nin dealing with metadata, some metadata is critical in the reconstruction of events (e.g. user identification, times, critical process parameters, units of measure) and would be considered as relevant metadata that should be fully captured and managed. however, non-critical metadata such as system error logs or non-critical system checks may not require full capture and management where justified using risk management.\n\n## 9.1.6\n\nwhen determining data vulnerability and risk, it is important that the computerized system is considered in the context of its use within the business process. for example, the integrity of results generated by an analytical method utilizing an integrated computer interface are affected by sample preparation, entry of sample weights into the system, use of the system to generate data, and processing/recording of the final result using that data. the creation and assessment of a data flow map may be useful in understanding the risks and vulnerabilities of computerized systems, particularly interfaced systems.\n\n## 9.1.7\n\nconsideration should be given to the inherent data integrity controls incorporated into the system and/or software, especially those that may be more vulnerable to exploits than more modern systems that have been designed to meet contemporary data management requirements. 
examples of systems that may have vulnerabilities include: manual recording systems, older electronic systems with obsolete security measures, non-networked electronic systems, and those that require additional network security protection e.g. using firewalls and intrusion detection or prevention systems.\n\n## 9.1.8\n\nduring inspection of computerized systems, inspectors are recommended to utilize the company's expertise during assessment. asking and instructing the company's representatives to facilitate access and navigation can aid in the inspection of the system.\n\n## 9.1.9\n\nthe guidance herein is intended to provide specific considerations for data integrity in the context of computerized systems. further guidance regarding good practices for computerized systems may be found in the pic/s good practices for computerized systems in regulated \"gxp\" environments (pi 011).\n\n## 9.1.10\n\nthe principles herein apply equally to circumstances where the provision of computerized systems is outsourced. 
in these cases, the regulated entity retains the responsibility to ensure that outsourced services are managed and assessed in accordance with gmp/gdp requirements, and that appropriate data management and integrity controls are understood by both parties and effectively implemented.\n\n## 9.2\n\n### qualification and validation of computerized systems\n\n## 9.2.1\n\nthe qualification and validation of computerized systems should be performed in accordance with the relevant gmp/gdp guidelines; the tables below provide clarification regarding specific expectations for ensuring good data governance practices for computerized systems.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "e3fd53ba-55d0-4b4d-930f-fbfc3d2f2438": {"__data__": {"id_": "e3fd53ba-55d0-4b4d-930f-fbfc3d2f2438", "embedding": null, "metadata": {"page_label": "33", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "System Validation and Data Integrity Maintenance Guidelines", "questions_this_excerpt_can_answer": "1. What specific steps should regulated companies take during the initial stages of system procurement to ensure data management and integrity requirements are met, according to the guidelines provided in the document \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\"?\n\n2. 
How should legacy systems be evaluated to ensure they meet the necessary data management and integrity controls as outlined in the \"System Validation and Data Integrity Maintenance Guidelines\"?\n\n3. What are some of the additional controls that can be implemented if a system's functionality or design does not inherently provide an appropriate level of control for data integrity, as recommended in the document?", "prev_section_summary": "This section focuses on the importance of data integrity and governance in computerized systems in regulated environments. Key topics covered include the capture and retention of raw data, management of critical metadata for event reconstruction, assessment of data vulnerability and risk, consideration of inherent data integrity controls in older systems, inspection of computerized systems, outsourcing of computerized system services, and the qualification and validation of computerized systems according to GMP/GDP guidelines. The section emphasizes the need for thorough data management practices and understanding of risks associated with computerized systems to ensure data integrity and compliance with regulatory requirements.", "excerpt_keywords": "Data Management, Data Integrity, System Validation, Legacy Systems, Regulatory Compliance"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n## validation and maintenance\n\n|item|system validation & maintenance|\n|---|---|\n|1.|expectation regulated companies should document and implement appropriate controls to ensure that data management and integrity requirements are considered in the initial stages of system procurement and throughout system and data lifecycle. for regulated users, functional specifications (fs) and/or user requirement specifications (urs) should adequately address data management and integrity requirements. 
specific attention should be paid to the purchase of gmp/gdp critical equipment to ensure that systems are appropriately evaluated for data integrity controls prior to purchase. legacy systems (existing systems in use) should be evaluated to determine whether existing system configuration and functionality permits the appropriate control of data in accordance with good data management and integrity practices. where system functionality or design of these systems does not provide an appropriate level of control, additional controls should be considered and implemented. potential risk of not meeting expectations/items to be checked - inadequate consideration of di requirements may result in the purchase of software systems that do not include the basic functionality required to meet data management and integrity expectations. - inspectors should verify that the implementation of new systems followed a process that gave adequate consideration to di principles. - some legacy systems may not include appropriate controls for data management, which may allow the manipulation of data with a low probability of detection. - assessments of existing systems should be available and provide an overview of any vulnerabilities and list any additional controls implemented to assure data integrity. additional controls should be appropriately validated and may include: - using operating system functionality (e.g. 
windows active directory groups) to assign users and their access privileges where system software does not include administrative controls to control user privileges; - configuring operating system file/folder permissions to prevent modification/deletion of files when the|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "44b218cb-0fec-4326-9faa-bbeb552daf77": {"__data__": {"id_": "44b218cb-0fec-4326-9faa-bbeb552daf77", "embedding": null, "metadata": {"page_label": "34", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "\"Guidelines for Ensuring Compliance and Control of Computerised Systems for Data Integrity and Product Quality\"", "questions_this_excerpt_can_answer": "1. What specific criteria should regulated users consider when creating an inventory of all computerised systems in use within a regulated environment, according to the \"Guidelines for Ensuring Compliance and Control of Computerised Systems for Data Integrity and Product Quality\"?\n\n2. How does the document \"Guidelines for Ensuring Compliance and Control of Computerised Systems for Data Integrity and Product Quality\" suggest regulated users assess the risk and validation requirements for computerised systems, particularly in relation to their impact on product quality and data integrity?\n\n3. 
What are the potential risks and areas that need to be checked for companies that fail to maintain an adequate inventory and risk assessment of their computerised systems as outlined in the \"Guidelines for Ensuring Compliance and Control of Computerised Systems for Data Integrity and Product Quality\"?", "prev_section_summary": "The section discusses the importance of system validation and maintenance in ensuring data management and integrity in regulated environments. It highlights the need for regulated companies to document and implement appropriate controls during system procurement and throughout the system lifecycle. Specific attention should be paid to GMP/GDP critical equipment, and legacy systems should be evaluated to determine if they meet data integrity controls. Additional controls may be necessary if system functionality or design does not provide adequate control for data integrity. The potential risks of not meeting data management and integrity requirements are also outlined, along with suggestions for additional controls such as using operating system functionality and configuring file/folder permissions.", "excerpt_keywords": "Data Management, Data Integrity, Regulated Environments, Computerised Systems, Risk Assessment"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\nexpectation\n\nregulated users should have an inventory of all computerised systems in use. the list should include reference to:\n- the name, location and primary function of each computerised system;\n- assessments of the function and criticality of the system and associated data (e.g. 
direct gmp/gdp impact, indirect impact, none)\n- the current validation status of each system and reference to existing validation documents.\n\nrisk assessments should be in place for each system, specifically assessing the necessary controls to ensure data integrity. the level and extent of validation of controls for data integrity should be determined based on the criticality of the system and process and potential risk to product quality, e.g. processes or systems that generate or control batch release data would generally require greater control than those systems managing less critical data or processes.\n\nconsideration should also be given to those systems with higher potential for disaster, malfunction or situations in which the system becomes inoperative.\n\nassessments should also review the vulnerability of the system to inadvertent or unauthorised changes to critical configuration settings or manipulation of data. all controls should be documented and their effectiveness verified.\n\npotential risk of not meeting expectations/items to be checked:\n\n- companies that do not have adequate visibility of all computerised systems in place may overlook the criticality of systems and may thus create vulnerabilities within the data lifecycle.\n- an inventory list serves to clearly communicate all systems in place and their criticality, ensuring that any changes or modifications to these systems are controlled.\n- verify that risk assessments are in place for critical processing equipment and data acquisition systems. a lack of thorough assessment of system impact may lead to a lack of appropriate validation and system control. 
examples of critical systems to review include:\n- systems used to control the purchasing and status of products and materials;\n- systems for the control and data acquisition for critical manufacturing processes;\n- systems that generate, store or process data that is used to determine batch quality;\n- systems that generate data that is included in the batch processing or packaging records;", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "a22f9db6-0acf-435a-b74e-93c1b630cc29": {"__data__": {"id_": "a22f9db6-0acf-435a-b74e-93c1b630cc29", "embedding": null, "metadata": {"page_label": "35", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "\"Ensuring Compliance: Computerised System Validation and Data Integrity\"", "questions_this_excerpt_can_answer": "1. What specific items must be included in a validation summary report for new computerised systems to ensure compliance with Annex 15 requirements in regulated environments, as outlined in the \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\" document?\n\n2. How does the \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\" document recommend handling the documentation and updating of validation and data integrity requirements for existing computerised systems in regulated environments?\n\n3. 
What are the key risk areas and items to be checked to ensure data integrity and compliance with GMP/GDP requirements as per the guidance provided in the \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\" document, especially in relation to system configuration, segregation of duties, and user access management?", "prev_section_summary": "The section discusses the importance of having an inventory of all computerized systems in use within a regulated environment, including criteria such as system name, location, primary function, validation status, and risk assessments for data integrity. It emphasizes the need for controls to ensure data integrity based on the criticality of the system and potential risks to product quality. The section also highlights the potential risks of not maintaining an adequate inventory and risk assessment of computerized systems, such as overlooking critical systems and creating vulnerabilities in the data lifecycle. Key topics include system inventory, risk assessments, validation controls, critical systems, and potential risks of non-compliance. 
Key entities mentioned are regulated users, computerized systems, data integrity, product quality, validation documents, and critical processing equipment.", "excerpt_keywords": "Data Management, Data Integrity, Regulated Environments, Computerised System Validation, GMP/GDP"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n### expectation\n\nfor new systems, a validation summary report for each computerised system (written and approved in accordance with annex 15 requirements) should be in place and state (or provide reference to) at least the following items:\n\n- critical system configuration details and controls for restricting access to configuration and any changes (change management).\n- a list of all currently approved normal and administrative users specifying the username and the role of the user.\n- frequency of review of audit trails and system logs.\n- procedures for:\n- creating new system user;\n- modifying or changing privileges for an existing user;\n- defining the combination or format of passwords for each system;\n- reviewing and deleting users;\n- back-up processes and frequency;\n- disaster recovery;\n- data archiving (processes and responsibilities), including procedures for accessing and reading archived data;\n- approving locations for data storage.\n- the report should explain how the original data are retained with relevant metadata in a form that permits the reconstruction of the manufacturing process or the analytical activity.\n\n### for existing systems\n\ndocuments specifying the above requirements should be available; however, need not be compiled into the validation summary report. 
these documents should be maintained and updated as necessary by the regulated user.\n\n### potential risk of not meeting expectations/items to be checked\n\n- check that validation systems and reports specifically address data integrity requirements following gmp/gdp requirements and considering alcoa principles.\n- system configuration and segregation of duties (e.g. authorization to generate data should be separate to authorization to verify data) should be defined prior to validation, and verified as effective during testing.\n- check the procedures for system access to ensure modifications or changes to systems are restricted and subject to change control management.\n- ensure that system administrator access is restricted to authorized persons and is not used for routine operations.\n- check the procedures for granting, modifying and removing access to computerized systems to ensure these activities are controlled. check the currency of user access logs and privilege levels, there should be no unauthorized users to the system and access accounts should be kept up to date.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "f80e8a59-495a-4572-8f9f-771eedea8d18": {"__data__": {"id_": "f80e8a59-495a-4572-8f9f-771eedea8d18", "embedding": null, "metadata": {"page_label": "36", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "\"Guidelines for Computerised System Validation and Data Integrity 
Requirements\"", "questions_this_excerpt_can_answer": "1. What specific steps should companies take to ensure the validation of computerised systems according to GMP Annex 15, and how should these steps be tailored to address data integrity risks?\n \n2. How does the document PI 041-1 guide companies in assessing the extent of validation required for computerised systems based on risk, and what reference does it provide for further guidance on validation requirements?\n\n3. What are the expectations regarding the inclusion of data integrity principles in validation documents and reports, and what potential risks are associated with unvalidated systems in terms of data integrity?", "prev_section_summary": "The section discusses the importance of ensuring compliance with Annex 15 requirements in regulated environments through the validation of computerized systems. It outlines specific items that must be included in a validation summary report for new systems, such as system configuration details, user access controls, audit trail review frequency, and data archiving procedures. For existing systems, documentation of these requirements should be maintained and updated by regulated users. Key risk areas include data integrity, system configuration, segregation of duties, user access management, and compliance with GMP/GDP requirements. 
The section emphasizes the need for thorough validation processes, effective change control management, and controlled access to computerized systems to prevent unauthorized access and ensure data integrity.", "excerpt_keywords": "Validation, Computerised Systems, Data Integrity, GMP Annex 15, Risk Assessment"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\nthere should also be restrictions to prevent users from amending audit trail functions and from changing any pre-defined directory paths where data files are to be stored.\n\n#### expectation\n\ncompanies should have a validation master plan in place that includes specific policies and validation requirements for computerised systems and the integrity of such systems and associated data.\n\nthe extent of validation for computerised systems should be determined based on risk. further guidance regarding assessing validation requirements for computerised systems may be found in pi 011.\n\nbefore a system is put into routine use, it should be challenged with defined tests for conformance with the acceptance criteria.\n\nit would be expected that a prospective validation for computerised systems is conducted. appropriate validation data should be available for systems already in-use.\n\ncomputerised system validation should be designed according to gmp annex 15 with urs, dq, fat, sat, iq, oq and pq tests as necessary.\n\nthe qualification testing approach should be tailored for the specific system under validation, and should be justified by the regulated user. qualification may include design qualification (dq); installation qualification (iq); operational qualification (oq); and performance qualification (pq). 
in particular, specific tests should be designed in order to challenge those areas where data quality or integrity is at risk.\n\ncompanies should ensure that computerised systems are qualified for their intended use. companies should therefore not place sole reliance on vendor qualification packages; validation exercises should include specific tests to ensure data integrity is maintained during operations that reflect normal and intended use.\n\nthe number of tests should be guided by a risk assessment but the critical functionalities should be at least identified and tested, e.g., certain plcs and systems based on basic algorithms or logic sets, the functional testing may provide adequate assurance of reliability of the computerised system.\n\nfor critical and/or more complex systems, detailed verification testing is required during iq, oq & pq stages.\n\npotential risk of not meeting expectations/items to be checked\n\n- check that validation documents include specific provisions for data integrity; validation reports should specifically address data integrity principles and demonstrate through design and testing that adequate controls are in place.\n- unvalidated systems may present a significant vulnerability regarding data integrity as user access and system configuration may allow data amendment.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "c160d8b2-03b3-4231-a046-f55f948ffd7d": {"__data__": {"id_": "c160d8b2-03b3-4231-a046-f55f948ffd7d", "embedding": null, "metadata": {"page_label": "37", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated 
Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "\"Comprehensive Guide for Periodic Evaluation and Timely Updates of Computerised Systems and Network Components\"", "questions_this_excerpt_can_answer": "1. What specific criteria should be used to determine the frequency of periodic evaluations for computerised systems in regulated environments, according to the \"Good Practices for Data Management and Integrity in Regulated Environments (2021)\" document?\n\n2. How does the \"Good Practices for Data Management and Integrity in Regulated Environments (2021)\" document recommend handling operating systems and network components that have reached an unsupported state to ensure the continued integrity and management of data?\n\n3. According to the \"Good Practices for Data Management and Integrity in Regulated Environments (2021)\" document, what measures should be taken when applying security patches to operating systems and network components to maintain data security while adhering to change management principles?", "prev_section_summary": "The section discusses the guidelines for computerised system validation and data integrity requirements in regulated environments. Key topics include the importance of having a validation master plan, determining the extent of validation based on risk, conducting prospective validation for computerised systems, and the qualification testing approach. 
Entities mentioned include companies, computerised systems, validation master plan, validation requirements, validation data, GMP Annex 15, URS, DQ, FAT, SAT, IQ, OQ, PQ tests, vendor qualification packages, data integrity principles, and potential risks associated with unvalidated systems.", "excerpt_keywords": "Data Management, Data Integrity, Regulated Environments, Computerised Systems, Periodic Evaluation"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n## expectation\n\nperiodic system evaluation\n\ncomputerised systems should be evaluated periodically in order to ensure continued compliance with respect to data integrity controls. the evaluation should include deviations, changes (including any cumulative effect of changes), upgrade history, performance and maintenance, and assess whether these changes have had any detrimental effect on data management and integrity controls.\n\nthe frequency of the re-evaluation should be based on a risk assessment depending on the criticality of the computerised systems considering the cumulative effect of changes to the system since the last review. 
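The risk-based re-evaluation frequency described above lends itself to a simple documented rule. A minimal sketch, assuming a hypothetical three-level criticality scale, illustrative intervals and an illustrative change-count threshold (the guidance only requires that the frequency be justified by a risk assessment and consider the cumulative effect of changes):

```python
from datetime import date, timedelta

# Illustrative mapping of system criticality to re-evaluation interval.
# These intervals are assumptions for the sketch, not values from the guidance.
REVIEW_INTERVALS = {
    'high': timedelta(days=365),    # e.g. systems generating critical GMP/GDP data
    'medium': timedelta(days=730),
    'low': timedelta(days=1095),
}

def next_review_date(last_review: date, criticality: str,
                     changes_since_review: int) -> date:
    """Shorten the interval when many changes have accumulated, reflecting
    the cumulative-effect-of-changes consideration in the guidance."""
    interval = REVIEW_INTERVALS[criticality]
    if changes_since_review > 10:  # hypothetical threshold
        interval = timedelta(days=interval.days // 2)
    return last_review + interval
```

The point of the sketch is only that the interval is derived from documented criteria rather than chosen ad hoc, so the resulting schedule can be audited.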
the assessment performed should be documented.\n\npotential risk of not meeting expectations/items to be checked:\n\n- check that re-validation reviews for computerised systems are outlined within validation schedules.\n- verify that systems have been subject to periodic review, particularly with respect to any potential vulnerabilities regarding data integrity.\n- any issues identified, such as limitations of current software/hardware, should be addressed in a timely manner; corrective and preventive actions and interim controls should be available and implemented to manage any identified risks.\n\n## expectation\n\noperating systems and network components (including hardware) should be updated in a timely manner according to vendor recommendations, and migration of applications from older to newer platforms should be planned and conducted in advance, before the platforms reach an unsupported state which may affect the management and integrity of data generated by the system.\n\nsecurity patches for operating systems and network components should be applied in a controlled and timely manner according to vendor recommendations in order to maintain data security. the application of security patches should be performed in accordance with change management principles.\n\nwhere unsupported operating systems are maintained, i.e. old operating systems are used even after they run out of support by the vendor or supported versions are not security patched, the systems (servers) should be isolated as much as possible from the rest of the network. 
remaining interfaces and data transfer to/from other equipment should be carefully designed, configured and qualified to prevent exploitation of the vulnerabilities caused by the unsupported operating system.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "561dc3c9-9014-42ad-820f-7edfbfc98e22": {"__data__": {"id_": "561dc3c9-9014-42ad-820f-7edfbfc98e22", "embedding": null, "metadata": {"page_label": "38", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Best Practices for Data Transfer, System Updates, and Data Integrity Management", "questions_this_excerpt_can_answer": "1. What specific measures are recommended to ensure the integrity and security of data during the transfer process in regulated environments, according to the 2021 guidelines in \"PI 041-1 Good Practices for Data Management and Integrity\"?\n\n2. How does the document \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\" address the risks associated with remote access to unsupported systems, and what are the recommended evaluations or controls to mitigate these risks?\n\n3. 
In the context of system updates and data migration within regulated environments, what are the expectations set forth by the \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\" for ensuring that existing and archived data remain accessible and intact post-update or migration?", "prev_section_summary": "The section discusses the importance of periodic evaluation of computerized systems in regulated environments to ensure data integrity controls are maintained. It emphasizes the need for risk-based assessments to determine the frequency of evaluations and highlights the potential risks of not meeting these expectations. Additionally, the section addresses the timely updating of operating systems and network components, including the application of security patches to maintain data security. It also mentions the need to plan for migration to newer platforms before reaching an unsupported state to prevent risks to data integrity. Lastly, it advises on isolating unsupported operating systems from the network and carefully designing interfaces to prevent vulnerabilities.", "excerpt_keywords": "Data Management, Data Integrity, Regulated Environments, System Updates, Data Transfer"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n### remote access to unsupported systems\n\nremote access to unsupported systems should be carefully evaluated due to inherent vulnerability risks.\n\n### potential risk of not meeting expectations/items to be checked\n\nverify that system updates are performed in a controlled and timely manner. 
older systems should be reviewed critically to determine whether appropriate data integrity controls are integrated, or (where integrated controls are not possible) that appropriate administrative controls have been implemented and are effective.\n\n### data transfer\n\nitem: data transfer and migration\n\n#### expectation\n\ninterfaces should be assessed and addressed during validation to ensure the correct and complete transfer of data. interfaces should include appropriate built-in checks for the correct and secure entry and processing of data, in order to minimize data integrity risks. verification methods may include the use of:\n\n- secure transfer\n- encryption\n- checksums\n\nwhere applicable, interfaces between systems should be designed and qualified to include an automated transfer of gmp/gdp data.\n\n### potential risk of not meeting expectations/items to be checked\n\ninterfaces between computerized systems present a risk whereby data may be inadvertently lost, amended, or transcribed incorrectly during the transfer process.\nensure data is transferred directly to the secure location/database and not simply copied from the local drive (where it may have the potential to be altered).\ntemporary data storage on local computerized systems (e.g. instrument computer) before transfer to final storage or data processing location creates an opportunity for data to be deleted or manipulated. this is a particular risk in the case of standalone (non-networked) systems. 
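Of the verification methods listed under the expectation above, a checksum comparison is the simplest to illustrate. A minimal sketch, assuming SHA-256 and hypothetical file paths and function names (the guidance does not prescribe a particular algorithm):

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    # Hash the file in chunks so large data files need not fit in memory.
    digest = hashlib.sha256()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(8192), b''):
            digest.update(chunk)
    return digest.hexdigest()

def verify_transfer(source: Path, destination: Path) -> bool:
    # The transfer is treated as complete and correct only when the
    # checksum of the destination copy matches that of the source.
    return sha256_of(source) == sha256_of(destination)
```

In practice the source checksum would be recorded before the transfer and re-verified at the destination, with any mismatch handled as a deviation rather than silently retried.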
ensure the environment that initially stores the data has appropriate data integrity controls in place.\nwell-designed and qualified automated data transfer is much more reliable than any manual data transfer conducted by humans.\n\n#### expectation\n\nwhere system software (including operating system) is installed or updated, the user should ensure that existing and archived data can be", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "53537c3f-4eef-4b6f-a8e8-9275a47d103e": {"__data__": {"id_": "53537c3f-4eef-4b6f-a8e8-9275a47d103e", "embedding": null, "metadata": {"page_label": "39", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Risk Assessment and Data Accessibility in Legacy Systems Software Maintenance: A Comprehensive Analysis", "questions_this_excerpt_can_answer": "1. What strategies are recommended for preserving data accessibility when legacy systems software can no longer be supported, according to the document \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\"?\n\n2. How does the document suggest handling the risk assessment and decision-making process when migrating legacy data to a new file format, especially when full original data functionality cannot be maintained?\n\n3. 
What specific measures are advised to ensure the integrity and control of legacy systems software when it is maintained in a virtual environment, as outlined in the document?", "prev_section_summary": "The section discusses best practices for data transfer, system updates, and data integrity management in regulated environments. It highlights the importance of evaluating remote access to unsupported systems, ensuring data integrity during transfer processes, and addressing potential risks associated with system updates and data migration. The document emphasizes the need for secure data transfer methods, encryption, and checksums to minimize data integrity risks. It also stresses the importance of automated data transfer and the critical review of older systems to ensure appropriate data integrity controls are in place. Additionally, the section emphasizes the need to transfer data directly to secure locations, avoid temporary storage on local systems, and ensure existing and archived data remain accessible post-update or migration.", "excerpt_keywords": "Legacy systems, Data accessibility, Risk assessment, Data integrity, Virtual environment"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\nwhen legacy systems software can no longer be supported, consideration should be given to maintaining the software for data accessibility purposes (for as long as possible depending upon the specific retention requirements). this may be achieved by maintaining software in a virtual environment. migration to an alternative file format that retains as much as possible of the true copy attributes of the data may be necessary with increasing age of the legacy data. where migration with full original data functionality is not technically possible, options should be assessed based on risk and the importance of the data over time. 
the migration file format should be selected considering the balance of risk between long-term accessibility versus the possibility of reduced dynamic data functionality (e.g. data interrogation, trending, re-processing, etc.) the risk assessment should also review the vulnerability of the system to inadvertent or unauthorised changes to critical configuration settings or manipulation of data. all controls to mitigate risk should be documented and their effectiveness verified. it is recognised that the need to maintain accessibility may require migration to a file format that loses some attributes and/or dynamic data functionality.\n\npotential risk of not meeting expectations/items to be checked\n\n- when the software is maintained in a virtual environment, check that appropriate measures to control the software (e.g. validation status, access control by authorised persons, etc.) are in place. all controls should be documented and their effectiveness verified.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "05cfa466-54ca-46a4-bbb8-305120d4f123": {"__data__": {"id_": "05cfa466-54ca-46a4-bbb8-305120d4f123", "embedding": null, "metadata": {"page_label": "40", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "User Access Controls and Password Management in Computerised Systems: Best Practices and Guidelines", "questions_this_excerpt_can_answer": "1. 
What specific measures should be implemented in computerised systems to ensure individual accountability and prevent unauthorized access, as outlined in the \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\" document?\n\n2. How does the document \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\" recommend handling user access roles and privileges in computerised systems to maintain system security and data integrity?\n\n3. According to the \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\" document, what are the guidelines for generating and maintaining user access logs in computerised systems, and how should these logs be utilized for security and compliance purposes?", "prev_section_summary": "The section discusses strategies for preserving data accessibility in legacy systems software when it can no longer be supported. It emphasizes the importance of maintaining software in a virtual environment and potentially migrating to an alternative file format to retain data attributes. The document suggests conducting a risk assessment when migrating data to a new format, considering the balance between long-term accessibility and reduced functionality. It also highlights the need for controls to mitigate risks such as unauthorized changes or data manipulation. The section emphasizes the importance of documenting and verifying the effectiveness of all controls in place.", "excerpt_keywords": "User Access Controls, Password Management, Computerised Systems, Data Integrity, System Security"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n## system security for computerised systems\n\n|item|system security|\n|---|---|\n|1. 
expectation|user access controls shall be configured and enforced to prohibit unauthorised access to, changes to and deletion of data. the extent of security controls is dependent on the criticality of the computerised system. for example:|\n| |- individual login ids and passwords should be set up and assigned for all staff needing to access and utilise the specific electronic system. shared login credentials do not allow for traceability to the individual who performed the activity. for this reason, shared passwords, even for reasons of financial savings, should be prohibited. login parameters should be verified during validation of the electronic system to ensure that login profiles, configuration and password format are clearly defined and function as intended.|\n| |- input of data and changes to computerised records should be made only by authorised personnel. companies should maintain a list of authorised individuals and their access privileges for each electronic system in use.|\n| |- appropriate controls should be in place regarding the format and use of passwords, to ensure that systems are effectively secured.|\n| |- upon initially having been granted system access, a system should allow the user to create a new password, following the normal password rules.|\n| |- systems should support different user access roles (levels) and assignment of a role should follow the least-privilege rule, i.e. assigning the minimum necessary access level for any job function. as a minimum, simple systems should have normal and admin users, but complex systems will typically require more levels of users (e.g. a hierarchy) to effectively support access control.|\n| |- granting of administrator access rights to computerised systems and infrastructure used to run gmp/gdp critical applications should be strictly controlled. administrator access rights should not be given to normal users on the system (i.e. 
segregation of duties).|\n| |- normal users should not have access to critical aspects of the computerised system, e.g. system clocks, file deletion functions, etc.|\n| |- systems should be able to generate a list of users with actual access to the system, including user identification and roles. user lists should include the names or unique identifiers that permit identification of specific individuals. the list should be used during periodic user reviews.|\n| |- systems should be able to generate a list of successful and unsuccessful login attempts, including: - user identification\n- user access role\n- date and time of the attempted login, either in local time or traceable to local time\n- session length, in the case of successful logins\n|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "a57744b0-46ad-46e2-ba7d-816a892c5078": {"__data__": {"id_": "a57744b0-46ad-46e2-ba7d-816a892c5078", "embedding": null, "metadata": {"page_label": "41", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "User Access Controls and Security Measures in Computerized Systems: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. What specific measures should be taken to ensure user access controls maintain strict segregation of duties within computerized systems in regulated environments, according to the 2021 guidelines?\n\n2. 
How does the document suggest handling system administrator roles in smaller organizations to maintain data integrity and security within their electronic systems?\n\n3. What are the recommended procedures for managing inactivity logouts in computerized systems to prevent unauthorized access, as outlined in the 2021 good practices guide?", "prev_section_summary": "The section discusses best practices for user access controls and password management in computerised systems to ensure system security and data integrity. Key topics include individual accountability, prohibition of unauthorized access, use of unique login IDs and passwords, authorization for data input and changes, password format controls, least-privilege rule for user access roles, segregation of duties for administrator access rights, restrictions on normal user access to critical system aspects, generation of user lists and access logs for security and compliance purposes. Key entities mentioned are user access controls, login IDs, passwords, authorized personnel, user access roles, administrator access rights, system security controls, user lists, and access logs.", "excerpt_keywords": "user access controls, system administrators, smaller organizations, authorization, computerized systems"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n## user access controls\n\nuser access controls should ensure strict segregation of duties (i.e. that all users on a system who are conducting normal work tasks should have only normal access rights). normally, users with elevated access rights (e.g. 
admin) should not conduct normal work tasks on the system.\n\n## system administrators\n\nsystem administrators should normally be independent from users performing the task, and have no involvement or interest in the outcome of the data generated or available in the electronic system. for example, qc supervisors and managers should not be assigned as the system administrators for electronic systems in their laboratories (e.g. hplc, gc, uv-vis). typically, individuals outside of the quality and production organisations (e.g. information technology administrators) should serve as the system administrators and have enhanced permission levels.\n\n## smaller organizations\n\nfor smaller organizations, it may be permissible for a nominated person in the quality unit or production department to hold access as the system administrator; however, in these cases the administrator access should not be used for performing routine operations and the user should hold a second and restricted access for performing routine operations. in these cases all administrator activities conducted should be recorded and approved within the quality system.\n\n## authorization\n\nany request for new users, new privileges of users should be authorized by appropriate personnel (e.g. line manager and system owner) and forwarded to the system administrator in a traceable way in accordance with a standard procedure.\n\n## computerized systems\n\ncomputerized systems giving access to gmp/gdp critical data or operations should have an inactivity logout, which, either at the application or the operating system level, logs out a user who has been inactive longer than a predefined time. the time should be shorter, rather than longer and should typically be set to prevent unauthorized access to systems. 
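The inactivity logout described here can be sketched at the application level. A minimal illustration with an injectable clock for testability; the class name, the 300-second default and the re-authentication flag are assumptions for the sketch, since the guidance deliberately leaves the timeout value to the regulated user:

```python
import time

class Session:
    """Sketch of an application-level inactivity logout: the session is
    invalidated once idle time exceeds the timeout, after which the user
    must go through the normal authentication procedure again."""

    def __init__(self, user: str, timeout_seconds: int = 300,
                 clock=time.monotonic):
        self.user = user
        self.timeout = timeout_seconds
        self._clock = clock                  # injectable for testing
        self._last_activity = clock()
        self.authenticated = True

    def touch(self) -> None:
        # Record user activity; an already-expired session stays logged out
        # until the user re-authenticates.
        self._expire_if_idle()
        if self.authenticated:
            self._last_activity = self._clock()

    def is_active(self) -> bool:
        self._expire_if_idle()
        return self.authenticated

    def _expire_if_idle(self) -> None:
        if self._clock() - self._last_activity > self.timeout:
            self.authenticated = False
```

Checking expiry lazily on each access, rather than with a background timer, keeps the sketch simple; a real system would also have to enforce the logout at the operating-system level where the application cannot.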
upon activation of the inactivity logout, the system should require the user to go through the normal authentication procedure to login again.\n\n## potential risks\n\ncheck that the company has taken all reasonable steps to ensure that the computerized system in use is secured, and protected from deliberate or inadvertent changes.\nsystems that are not physically and administratively secured are vulnerable to data integrity issues. inspectorates should confirm that verified procedures exist that manage system security, ensuring that computerized systems are maintained in their validated state and protected from manipulation.\ncheck that individual user log-in ids are in use. where the system configuration allows the use of individual user log-in ids, these should be used.\nit is acknowledged that some legacy computerized systems support only a single user login or limited numbers of user logins. where no suitable alternative computerized system is available, equivalent control may be provided by third party software, or a paper based method of providing traceability (with version control). 
the suitability", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "9f3bf483-edac-4661-9f21-8e17d77d7d23": {"__data__": {"id_": "9f3bf483-edac-4661-9f21-8e17d77d7d23", "embedding": null, "metadata": {"page_label": "42", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Ensuring Data Integrity and Security in Computerised Systems: Best Practices and Guidelines", "questions_this_excerpt_can_answer": "1. What specific measures are recommended for ensuring the integrity and security of passwords within computerised systems in regulated environments, according to the 2021 guidelines?\n \n2. How does the document \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\" address the issue of user access levels and the importance of authority checks in maintaining data integrity in computerised systems?\n\n3. What strategies are outlined in the document for protecting computerised systems from unauthorized changes and ensuring the physical security of system hardware, as part of maintaining data integrity in regulated environments?", "prev_section_summary": "The section discusses user access controls, system administrators, authorization procedures, inactivity logouts in computerized systems, potential risks related to system security, and the importance of maintaining data integrity. 
Key entities mentioned include segregation of duties, system administrators, quality unit, production department, line manager, system owner, inactivity logout, authentication procedures, computerized systems, and potential risks to data integrity. The section emphasizes the need for strict controls and procedures to ensure the security and integrity of data in regulated environments.", "excerpt_keywords": "Data Management, Data Integrity, Regulated Environments, Computerised Systems, Security Measures"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n## of alternative systems should be justified and documented.\n\nincreased data review is likely to be required for hybrid systems.\n\n- inspectors should verify that a password policy is in place to ensure that systems enforce good password rules and require strong passwords. consideration should be made to using stronger passwords for systems generating or processing critical data.\n- systems where a new password cannot be changed by the user, but can only be created by the admin, are incompatible with data integrity, as the confidentiality of passwords cannot be maintained.\n- check that user access levels are appropriately defined, documented and controlled. the use of a single user access level on a system and assigning all users this role, which per definition will be the admin role, is not acceptable.\n- verify that the system uses authority checks to ensure that only authorised individuals can use the system, electronically sign a record, access the operation or computerised system input or output device, alter a record, or perform the operation at hand.\n\n## expectation\n\ncomputerised systems should be protected from accidental changes or deliberate manipulation. 
companies should assess systems and their design to prevent unauthorised changes to validated settings that may ultimately affect data integrity. consideration should be given to:\n\n- the physical security of computerised system hardware:\n- location of and access to servers;\n- restricting access to plc modules, e.g. by locking access panels.\n- physical access to computers, servers and media should be restricted to authorised individuals. users on a system should not normally have access to servers and media.\n- vulnerability of networked systems from local and external attack;\n- remote network updates, e.g. automated updating of networked systems by the vendor.\n- security of system settings, configurations and key data. access to critical data/operating parameters of systems should be appropriately restricted and any changes to settings/configuration controlled through change management processes by authorised personnel.\n- the operating system clock should be synchronized with the clock of connected systems and access to all clocks restricted to authorised personnel.\n- appropriate network security measures should be applied, including intrusion prevention and detection systems.\n- firewalls should be set up to protect critical data and operations. port openings (firewall rules) should be based on the least privilege policy, making the firewall rules as tight as possible and thereby permitting only allowed traffic.\n\nregulated users should conduct periodic reviews of the continued appropriateness and effectiveness of network security measures (e.g. 
by the use of network vulnerability scans of the it infrastructure to identify", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "d1b69d79-fbff-4402-83f0-822f80b23b48": {"__data__": {"id_": "d1b69d79-fbff-4402-83f0-822f80b23b48", "embedding": null, "metadata": {"page_label": "43", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Comprehensive Guide to Network Security and Access Control Best Practices", "questions_this_excerpt_can_answer": "1. What specific authentication methods are recommended for systems containing critical data accessible via the internet, according to the 2021 guidelines in \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments\"?\n\n2. How does the document \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\" suggest network system security should be assessed and what specific technologies are recommended to supplement firewalls for enhanced protection?\n\n3. According to the 2021 guidelines in \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments,\" what are the documented procedures for reviewing firewall rules, and what potential risks are associated with not periodically reviewing these rules?", "prev_section_summary": "The section discusses measures for ensuring data integrity and security in computerised systems in regulated environments. 
Key topics include password policies, user access levels, authority checks, physical security of system hardware, prevention of unauthorized changes, and network security measures. Entities mentioned include password rules, user roles, system administrators, authorized individuals, networked systems, system settings, configurations, key data, operating parameters, clocks, network security measures, firewalls, and network vulnerability scans.", "excerpt_keywords": "network security, data integrity, regulated environments, firewall rules, authentication methods"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n### potential security weaknesses) and ensure operating systems are maintained with current security measures.\n\npotential risk of not meeting expectations/items to be checked\n\ncheck that access to hardware and software is appropriately secured, and restricted to authorised personnel.\nverify that suitable authentication methods are implemented. these methods should include user ids and passwords but other methods are possible and may be required. 
however, it is essential that users are positively identifiable.\nfor remote authentication to systems containing critical data available via the internet, verify that additional authentication techniques are employed, such as the use of pass code tokens or biometrics.\nverify that access to key operational parameters for systems is appropriately controlled and that, where appropriate, systems enforce the correct order of events and parameters in critical sequences of gmp/gdp steps.\n\n### expectation\n\nnetwork protection\n\nnetwork system security should include appropriate methods to detect and prevent potential threats to data.\n\nthe level of network protection implemented should be based on an assessment of data risk.\n\nfirewalls should be used to prevent unauthorised access, and their rules should be subject to periodic reviews against specifications in order to ensure that they are set as restrictive as necessary, allowing only permitted traffic. the reviews should be documented.\n\nfirewalls should be supplemented with appropriate virus-protection or intrusion prevention/detection systems to protect data and computerised systems from attempted attacks and malware.\n\npotential risk of not meeting expectations/items to be checked\n\ninadequate network security presents risks associated with vulnerability of systems from unauthorised access, misuse or modification.\ncheck that appropriate measures to control network access are in place. processes should be in place for the authorisation, monitoring and removal of access.\nsystems should be designed to prevent threats and detect attempted intrusions to the network, and these measures should be installed, monitored and maintained.\nfirewall rules are typically subject to changes over time, e.g. temporary opening of ports due to maintenance on servers etc. 
if never reviewed, firewall rules may become obsolete permitting unwanted traffic or intrusions.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "10c1c554-e5b6-44e9-8d0f-feabfd7f9f1a": {"__data__": {"id_": "10c1c554-e5b6-44e9-8d0f-feabfd7f9f1a", "embedding": null, "metadata": {"page_label": "44", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "\"Enhancing Security Measures for Electronic Signatures and USB Device Usage\"", "questions_this_excerpt_can_answer": "1. What specific controls are recommended to ensure the authenticity and traceability of electronic signatures in regulated environments, according to the document \"Good Practices for Data Management and Integrity in Regulated Environments (2021)\"?\n\n2. How does the document \"Good Practices for Data Management and Integrity in Regulated Environments (2021)\" suggest handling changes to data that have already been signed with an electronic signature to maintain data integrity?\n\n3. 
What measures does the document \"Good Practices for Data Management and Integrity in Regulated Environments (2021)\" recommend for preventing security breaches related to the use of USB devices in environments handling GMP/GDP critical data?", "prev_section_summary": "The section discusses the importance of network security and access control best practices in regulated environments, as outlined in the document \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).\" Key topics include authentication methods for systems containing critical data accessible via the internet, assessment of network system security, recommended technologies to supplement firewalls, reviewing firewall rules, and potential risks associated with inadequate network security. Entities mentioned include user ids, passwords, pass code tokens, biometrics, firewalls, virus-protection systems, intrusion prevention/detection systems, and unauthorized access.", "excerpt_keywords": "Electronic Signatures, Data Management, Data Integrity, USB Devices, Security Measures"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n#### 4. electronic signatures\n\nelectronic signatures used in the place of handwritten signatures should have appropriate controls to ensure their authenticity and traceability to the specific person who electronically signed the record(s). electronic signatures should be permanently linked to their respective record, i.e. if a later change is made to a signed record; the record should indicate the amendment and appear as unsigned. where used, electronic signature functionality should automatically log the date and time when a signature was applied. the use of advanced forms of electronic signatures is becoming more common (e.g. the use of biometrics is becoming more prevalent by firms). 
the use of advanced forms of electronic signatures should be encouraged.\n\npotential risk of not meeting expectations/items to be checked:\n\n- check that electronic signatures are appropriately validated, their issue to staff is controlled and that at all times, electronic signatures are readily attributable to an individual.\n- any changes to data after an electronic signature has been assigned should invalidate the signature until the data has been reviewed again and re-signed.\n\n#### 5. restrictions on use of usb devices\n\nfor reasons of system security, computerized systems should be configured to prevent vulnerabilities from the use of usb memory sticks and storage devices on computer clients and servers hosting gmp/gdp critical data. if necessary, ports should only be opened for approved purposes and all usb devices should be properly scanned before use. the use of private usb devices (flash drives, cameras, smartphones, keyboards, etc.) on company computer clients and servers hosting gmp/gdp data, or the use of company usb devices on private computers, should be controlled in order to prevent security breaches.\n\npotential risk of not meeting expectations/items to be checked:\n\n- this is especially important where operating system vulnerabilities are known that allow usb devices to trick the computer, by pretending to be another external device, e.g. 
keyboard, and can contain and start executable code.\n- controls should be in place to restrict the use of such devices to authorized users and measures to screen usb devices before use should be in place.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "8ce39848-39a8-4b03-bc28-70af829e635f": {"__data__": {"id_": "8ce39848-39a8-4b03-bc28-70af829e635f", "embedding": null, "metadata": {"page_label": "45", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "\"Best Practices for Establishing and Maintaining Effective Audit Trails in Computerised Systems\"", "questions_this_excerpt_can_answer": "1. What considerations should companies have regarding audit trail functionality when purchasing and implementing computerised systems for data management and integrity?\n \n2. How should regulated users approach the validation and qualification of audit trail functionalities in computerised systems to ensure compliance with ALCOA+ principles and GMP/GDP relevance?\n\n3. What specific guidance does the document provide on managing systems that allow administrative users to deactivate, delete, or modify audit trail functionality, and how should companies document such occurrences?", "prev_section_summary": "The section discusses the importance of electronic signatures in regulated environments, emphasizing the need for controls to ensure authenticity and traceability. 
It also highlights the risks associated with changes to signed records and the necessity of re-signing after any modifications. Additionally, the section addresses the restrictions on the use of USB devices to prevent security breaches, recommending controls to limit access to approved purposes and scanning devices before use. The key topics include electronic signatures, data integrity, USB device security, and risk mitigation measures. Key entities mentioned are electronic signatures, signed records, USB devices, and security controls.", "excerpt_keywords": "Data Management, Data Integrity, Audit Trails, Computerised Systems, Electronic Signatures"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n## audit trails for computerised systems\n\n|item|audit trails|\n|---|---|\n|1. expectation|consideration should be given to data management and integrity requirements when purchasing and implementing computerised systems. companies should select software that includes appropriate electronic audit trail functionality. companies should endeavor to purchase and upgrade older systems to implement software that includes electronic audit trail functionality. it is acknowledged that some very simple systems lack appropriate audit trails; however, alternative arrangements to verify the veracity of data should be implemented, e.g. administrative procedures, secondary checks and controls. additional guidance may be found under section 9.10 regarding hybrid systems.|\n| |audit trail functionality should be verified during validation of the system to ensure that all changes and deletions of critical data associated with each manual activity are recorded and meet alcoa+ principles. 
regulated users should understand the nature and function of audit trails within systems, and should perform an assessment of the different audit trails during qualification to determine the gmp/gdp relevance of each audit trail, and to ensure the correct management and configuration of audit trails for critical and gmp/gdp relevant data. this exercise is important in determining which specific trails and which entries within trails are of significance for review with a defined frequency established. for example, following such an assessment audit trail reviews may focus on:|\n| |- identifying and reviewing entries/data that relate to changes or modification of data.|\n| |- review by exception - focusing on anomalous or unauthorized activities.|\n| |- systems with limitations that allow change of parameters/data or where activities are left open to modification|\n| |- note: well-designed systems with permission settings that prevent change of parameters/data or have access restrictions that prevent changes to configuration settings may negate the need to examine related audit trails in detail|\n| |audit trail functionalities should be enabled and locked at all times and it should not be possible to deactivate, delete or modify the functionality. if it is possible for administrative users to deactivate, delete or modify the audit trail functionality, an automatic entry should be made in the audit trail indicating that this has occurred. companies should implement procedures that outline their policy and processes to determine the data that is required in audit trails, and the review of audit trails in accordance with risk management principles. 
critical|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "a1e9b433-0f86-4472-af86-975ca87dad12": {"__data__": {"id_": "a1e9b433-0f86-4472-af86-975ca87dad12", "embedding": null, "metadata": {"page_label": "46", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Audit Trail Review and Compliance Expectations for Electronic Systems: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. What specific steps should be taken to ensure the integrity of audit trails in regulated environments according to the PI 041-1 Good Practices for Data Management and Integrity document from 2021?\n \n2. How does the document \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\" address the handling and review of non-critical audit trails in comparison to critical audit trails within electronic systems?\n\n3. What are the detailed requirements for configuring audit trail functionalities in electronic-based systems as outlined in the \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\" document, especially regarding the capture of critical activities and changes to data?", "prev_section_summary": "The section discusses the importance of audit trails for computerized systems in maintaining data management and integrity in regulated environments. 
Key topics include considerations for selecting software with electronic audit trail functionality, verifying audit trail functionality during system validation, understanding the nature and function of audit trails, assessing the relevance of audit trails for GMP/GDP compliance, and ensuring audit trail functionalities are enabled and locked at all times. The section also addresses the need for procedures to document and review audit trails, especially in cases where administrative users have the ability to deactivate, delete, or modify audit trail functionality.", "excerpt_keywords": "Audit Trail, Data Management, Integrity, Regulated Environments, Electronic Systems"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n## audit trails related to each operation should be independently reviewed\n\nwith all other records related to the operation and prior to the review of the completion of the operation (e.g. prior to batch release) so as to ensure that critical data and changes to it are acceptable. this review should be performed by the originating department, and where necessary verified by the quality unit, e.g. during self-inspection or investigative activities. non-critical audit trails reviews can be conducted during system reviews at a pre-defined frequency. this review should be performed by the originating department, and where necessary verified by the quality unit (e.g. 
during batch release, self-inspection or investigative activities).\n\n### potential risk of not meeting expectations/items to be checked\n\nvalidation documentation should demonstrate that audit trails are functional, and that all activities, changes and other transactions within the systems are recorded, together with all relevant metadata.\nverify that audit trails are regularly reviewed (in accordance with quality risk management principles) and that discrepancies are investigated.\nif no electronic audit trail system exists, a paper-based record to demonstrate changes to data may be acceptable until a fully audit-trailed (integrated system or independent audit software using a validated interface) system becomes available. these hybrid systems are permitted, where they achieve equivalence to an integrated audit trail, such as described in annex 11 of the pic/s gmp guide.\nfailure to adequately review audit trails may allow manipulated or erroneous data to be inadvertently accepted by the quality unit and/or authorised person.\nclear details of which data are critical, and which changes and deletions should be recorded (audit trail), should be documented.\n\n## expectation\n\nwhere available, audit trail functionalities for electronic-based systems should be assessed and configured properly to capture any critical activities relating to the acquisition, deletion, overwriting of and changes to data for audit purposes. audit trails should be configured to record all manually initiated processes related to critical data. the system should provide a secure, computer-generated, time-stamped audit trail to independently record the date and time of entries and actions that create, modify, or delete electronic records. the audit trail should include the following parameters:\n\n- details of the user that undertook the action;\n- what action occurred, was changed, incl. old and new values;\n- when the action was taken, incl. 
date and time;\n- why the action was taken (reason); and", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "7808f342-5d85-47f8-8af5-f939f8ba91da": {"__data__": {"id_": "7808f342-5d85-47f8-8af5-f939f8ba91da", "embedding": null, "metadata": {"page_label": "47", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Audit Trail and Data Capture/Entry Compliance in Computerised Systems: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. What specific functionalities should an audit trail in a computerised system retain to ensure data integrity and compliance with regulatory expectations, as outlined in the PI 041-1 guidelines from 2021?\n \n2. According to the PI 041-1 guidelines, how should a computerised system handle the recording of critical process parameters (CPPs) like the order of addition of raw materials to ensure data integrity and compliance?\n\n3. What are the requirements for manual data entry into computerised systems as specified in the PI 041-1 guidelines, particularly regarding the authorization and documentation of data entries made by individuals?", "prev_section_summary": "The section discusses the importance of independently reviewing audit trails related to each operation in regulated environments to ensure the integrity of critical data and changes. 
It outlines the specific steps that should be taken to review audit trails, including the involvement of the originating department and the quality unit. The section also highlights the potential risks of not meeting audit trail expectations and the importance of validating and regularly reviewing audit trails. Additionally, it emphasizes the need for properly configuring audit trail functionalities in electronic-based systems to capture critical activities and changes to data for audit purposes. Key entities mentioned include validation documentation, electronic audit trail systems, critical data, and audit trail parameters such as user details, actions taken, timestamps, and reasons for actions.", "excerpt_keywords": "Keywords: Audit Trail, Data Integrity, Compliance, Computerised Systems, Manual Data Entry"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n## - in the case of changes or modifications to data, the name of any person authorising the change.\n\nthe audit trail should allow for reconstruction of the course of events relating to the creation, modification, or deletion of an electronic record. the system should be able to print and provide an electronic copy of the audit trail, and whether viewing in the system online or in a hardcopy, the audit trail should be available in a meaningful format. if possible, the audit trail should retain the dynamic functionalities found in the computerised system, (e.g. 
search functionality and ability to export data such as to a spreadsheet).\n\nnote: an audit trail should not be confused with a change control system, where changes may need to be appropriately controlled and approved under a pqs.\n\npotential risk of not meeting expectations/items to be checked\n\n- verify the format of audit trails to ensure that all critical and relevant information is captured.\n- the audit trail should include all previous values, and record changes should not overwrite or obscure previously recorded information.\n- audit trail entries should be recorded in true time and reflect the actual time of activities. systems recording the same time for a number of sequential interactions, or which only make an entry in the audit trail once all interactions have been completed, may not be in compliance with expectations to data integrity, particularly where each discrete interaction or sequence is critical, e.g. for the electronic recording of addition of 4 raw materials to a mixing vessel. if the order of addition is a critical process parameter (cpp), then each addition should be recorded individually, with time stamps. 
if the order of addition is not a cpp then the addition of all 4 materials could be recorded as a single timestamped activity.\n\n## 9.7 data capture/entry for computerised systems\n\n|item:|data capture/entry|\n|---|---|\n|expectation|systems should be designed for the correct capture of data whether acquired through manual or automated means.|\n\nfor manual entry:\n\n- the entry of critical data should only be made by authorised individuals and the system should record details of the entry, the individual making the entry and when the entry was made.\n\npi 041-1 47 of 63 1 july 2021", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "c892b423-69ae-471d-aa2d-bdf0e232e5b4": {"__data__": {"id_": "c892b423-69ae-471d-aa2d-bdf0e232e5b4", "embedding": null, "metadata": {"page_label": "48", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Data Entry, Verification, and Integrity Controls in Automated Systems: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. What specific measures are recommended for ensuring the integrity and verification of manual data entries in regulated environments, according to the \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\" document?\n\n2. 
How does the document \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\" suggest automated data capture systems should be validated to prevent data manipulation, loss, or change?\n\n3. What procedures does the \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\" document recommend for controlling and documenting any necessary changes or modifications to raw data in a regulated environment?", "prev_section_summary": "This section discusses the importance of audit trails in computerized systems to ensure data integrity and compliance with regulatory expectations. It outlines the specific functionalities that an audit trail should retain, such as the ability to reconstruct events, provide meaningful formats for viewing, and retain dynamic functionalities. The section also highlights the potential risks of not meeting audit trail expectations, such as verifying the format of audit trails, recording changes in true time, and ensuring all critical information is captured. 
Additionally, it addresses the requirements for manual data entry into computerized systems, emphasizing that critical data should only be entered by authorized individuals and that the system should record details of the entry, the individual making the entry, and when the entry was made.", "excerpt_keywords": "Data Management, Data Integrity, Regulated Environments, Automated Systems, Audit Trails"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n- data should be entered in a specified format that is controlled by the software, validation activities should verify that invalid data formats are not accepted by the system.\n\n- all manual data entries of critical data should be verified, either by a second operator, or by a validated computerised means.\n\n- changes to entries should be captured in the audit trail and reviewed by an appropriately authorised and independent person.\n\nfor automated data capture: (refer also to table 9.3)\n\n- the interface between the originating system, data acquisition and recording systems should be validated to ensure the accuracy of data.\n\n- data captured by the system should be saved into memory in a format that is not vulnerable to manipulation, loss or change.\n\n- the system software should incorporate validated checks to ensure the completeness of data acquired, as well as any relevant metadata associated with the data.\n\npotential risk of not meeting expectations/items to be checked\n\n- ensure that manual entries of critical data made into computerised systems are subject to an appropriate secondary check.\n- validation records should be reviewed for systems using automated data capture to ensure that data verification and integrity measures are implemented and effective, e.g. 
verify whether an auto save function was validated and, therefore, users have no ability to disable it and potentially generate unreported data.\n\nexpectation\n\nany necessary changes to data should be authorised and controlled in accordance with approved procedures. for example, manual integrations and reprocessing of laboratory results should be performed in an approved and controlled manner. the firm's quality unit should establish measures to ensure that changes to data are performed only when necessary and by designated individuals. original (unchanged) data should be retained in its original context. any and all changes and modifications to raw data should be fully documented and should be reviewed and approved by at least one appropriately trained and qualified individual.\n\npotential risk of not meeting expectations/items to be checked\n\n- verify that appropriate procedures exist to control any amendments or re-processing of data. evidence should demonstrate an appropriate process of formal approval for the proposed change, controlled/restricted/defined changes and formal review of the changes made.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "8ebc4103-913a-4ff6-99f7-a605b7d1d042": {"__data__": {"id_": "8ebc4103-913a-4ff6-99f7-a605b7d1d042", "embedding": null, "metadata": {"page_label": "49", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Review and Verification of 
Electronic Data in Computerised Systems: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. What specific steps should a regulated user take to ensure the integrity and correctness of electronic data within computerised systems according to the PI 041-1 guidelines?\n \n2. How does the PI 041-1 document recommend handling significant variations found during the audit trail review of electronic data in regulated environments?\n\n3. According to the PI 041-1 guidelines, what criteria should be used to determine the frequency, roles, and responsibilities for reviewing audit trails of electronic data in GMP/GDP relevant computerised systems?", "prev_section_summary": "The section discusses measures for ensuring data integrity and verification in regulated environments, including the importance of entering data in a specified format, verifying manual data entries, capturing changes in an audit trail, and validating automated data capture systems. It emphasizes the need for validation of data capture interfaces, saving data in a secure format, incorporating checks for data completeness, and controlling changes to raw data. The document recommends procedures for controlling and documenting changes to data, including formal approval processes and review by qualified individuals. 
Key entities include data entry, verification, validation activities, audit trails, automated data capture systems, data integrity measures, and procedures for controlling data changes.", "excerpt_keywords": "Data Management, Data Integrity, Regulated Environments, Electronic Data, Audit Trails"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n## review of data within computerised systems\n\nitem:\nreview of electronic data\n\nexpectation\n\nthe regulated user should perform a risk assessment in order to identify all the gmp/gdp relevant electronic data generated by the computerised systems, and the criticality of the data. once identified, critical data should be audited by the regulated user and verified to determine that operations were performed correctly and whether any change (modification, deletion or overwriting) have been made to original information in electronic records, or whether any relevant unreported data was generated. all changes should be duly authorised.\n\nan sop should describe the process by which data is checked by a second operator. these sops should outline the critical raw data that is reviewed, a review of data summaries, review of any associated log-books and hard-copy records, and explain how the review is performed, recorded and authorised.\n\nthe review of audit trails should be part of the routine data review within the approval process.\n\nthe frequency, roles and responsibilities of audit trail review should be based on a risk assessment according to the gmp/gdp relevant value of the data recorded in the computerised system. for example, for changes of electronic data that can have a direct impact on the quality of the medicinal products, it would be expected to review audit trails prior to the point that the data is relied upon to make a critical decision, e.g. 
batch release.\n\nthe regulated user should establish an sop that describes in detail how to review audit trails, what to look for and how to perform searches etc. the procedure should determine in detail the process that the person in charge of the audit trail review should follow. the audit trail review activity should be documented and recorded.\n\nany significant variation from the expected outcome found during the audit trail review should be fully investigated and recorded. a procedure should describe the actions to be taken if a review of audit trails identifies serious issues that can impact the quality of the medicinal products or the integrity of data.\n\npotential risk of not meeting expectations/items to be checked\n\n- check local procedures to ensure that electronic data is reviewed based on its criticality (impact to product quality and/or decision making). evidence of each review should be recorded and available to the inspector.\n- where data summaries are used for internal or external reporting, evidence should be available to demonstrate that such summaries have been verified in accordance with raw data.\n\npi 041-1 49 of 63 1 july 2021", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "27b21ecc-905e-4d2a-ba48-91887737886b": {"__data__": {"id_": "27b21ecc-905e-4d2a-ba48-91887737886b", "embedding": null, "metadata": {"page_label": "50", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", 
"document_title": "Guidelines for Audit Trail Reviews, Quality Unit Responsibilities, and Storage of Electronic Data", "questions_this_excerpt_can_answer": "1. What specific steps should a regulated party take according to their SOP for performing secondary reviews and audit trail reviews, and how should they respond if issues are identified during these reviews?\n \n2. How does the document recommend handling the storage, archival, and disposal of electronic data to ensure the integrity and security of the data, including backups and copies?\n\n3. What are the responsibilities of a company's quality unit in establishing a program for ongoing audit trail reviews, and how should discrepancies found during these reviews be addressed, including the escalation process for notifying senior management and national authorities?", "prev_section_summary": "The section discusses the review and verification of electronic data within computerized systems in regulated environments according to the PI 041-1 guidelines. Key topics include performing a risk assessment to identify critical data, auditing and verifying data for correctness and integrity, reviewing audit trails as part of routine data review, determining frequency and roles for audit trail review based on risk assessment, establishing SOPs for audit trail review, investigating and recording significant variations found during audit trail review, and potential risks of not meeting expectations related to data review and verification. 
Key entities mentioned include regulated users, critical data, audit trails, SOPs, risk assessment, and inspection requirements.", "excerpt_keywords": "Keywords: audit trail reviews, quality unit responsibilities, electronic data storage, data integrity, SOPs"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n## checklist for audit trail reviews\n\n- check that the regulated party has a detailed sop outlining the steps on how to perform secondary reviews and audit trail reviews and what steps to take if issues are found during the course of the review.\n- where global systems are used, it may be necessary for date and time records to include a record of the time zone to demonstrate contemporaneous recording.\n- check that known changes, modifications or deletions of data are actually recorded by the audit trail functionality.\n\n## company's quality unit responsibilities\n\nthe company's quality unit should establish a program and schedule to conduct ongoing reviews of audit trails based upon their criticality and the system's complexity in order to verify the effective implementation of current controls and to detect potential non-compliance issues. these reviews should be incorporated into the company's self-inspection programme. 
procedures should be in place to address and investigate any audit trail discrepancies, including escalation processes for the notification of senior management and national authorities where necessary.\n\n### potential risk of not meeting expectations/items to be checked\n\n- verify that self-inspection programs incorporate checks of audit trails, with the intent to verify the effectiveness of existing controls and compliance with internal procedures regarding the review of data.\n- audit trail reviews should be both random (selected based on chance) and targeted (selected based on criticality or risk).\n\n## storage, archival, and disposal of electronic data\n\n|item|storage, archival and disposal of electronic data|\n|---|---|\n|expectation|storage of data should include the entire original data and all relevant metadata, including audit trails, using a secure and validated process. if the data is backed up, or copies of it are made, then the backup and copies should also have the same appropriate levels of controls so as to prohibit unauthorized access to, changes to and deletion of data or their alteration. for example, a firm that backs up data onto portable hard drives should prohibit the ability to delete data from the hard drive. some additional considerations for the storage and backup of data include:|\n|- true copies of dynamic electronic records can be made, with the expectation that the entire content (i.e. all data and all relevant metadata is included) and meaning of the original records are preserved.| |\n|- stored data should be accessible in a fully readable format. 
companies may need to maintain suitable software and hardware| |", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "beae4628-e705-4ce6-a336-cda7042fd221": {"__data__": {"id_": "beae4628-e705-4ce6-a336-cda7042fd221", "embedding": null, "metadata": {"page_label": "51", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Data Backup and Archiving Policies and Procedures", "questions_this_excerpt_can_answer": "1. What specific measures are recommended for ensuring the readability of backup data throughout the entire regulatory retention period, especially in the context of software updates?\n \n2. How does the document suggest managing the risk associated with losing access to archived data due to software updates or equipment changes?\n\n3. What are the recommended practices for maintaining the integrity and accessibility of data during the archiving period, including environmental controls and access restrictions?", "prev_section_summary": "This section discusses guidelines for audit trail reviews, quality unit responsibilities, and the storage of electronic data in regulated environments. Key topics include the checklist for audit trail reviews, the responsibilities of the company's quality unit in conducting ongoing reviews, and the storage, archival, and disposal of electronic data. 
The section emphasizes the importance of having detailed SOPs for performing reviews, ensuring the integrity and security of data storage, and addressing discrepancies found during audit trail reviews. It also highlights the need for escalation processes for notifying senior management and national authorities in case of non-compliance issues.", "excerpt_keywords": "Data Management, Data Integrity, Regulatory Compliance, Backup Procedures, Archiving Policies"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n## expectations for data backup and archiving\n\nto access electronically stored data backups or copies during the retention period:\n\n- routine backup copies should be stored in a remote location (physically separated) in the event of disasters.\n- back-up data should be readable for the entire regulatory retention period, even if a new software version has been updated.\n- systems should allow backup and restoration of all data, including meta-data and audit trails.\n\n### potential risks of not meeting expectations/items to be checked:\n\n- check that data storage, backup, and archival systems are designed to capture all data and relevant metadata with documented validation.\n- ensure that metadata captured is based on risk management principles and critical for the reconstruction of activities or processes.\n- check that data associated with superseded or upgraded systems is managed appropriately and accessible.\n\n## expectations\n\nthe record retention procedures should include provisions for retaining metadata to allow for future queries or investigations to reconstruct related activities.\n\n## data backup and archiving procedures\n\ndata should be backed up periodically and archived according to written procedures. 
archive copies should be secured in a separate and remote location from the original data.\n\nthe data should be accessible, readable, and its integrity maintained throughout the archiving period.\n\nprocedures for restoring archived data in case of investigations should be in place and regularly tested.\n\nfacilities for archiving should have specific environmental controls and restricted access to ensure data protection from alteration or loss.\n\nwhen retiring a system due to long-term data access issues, procedures should assure the continued readability of the archived data, such as transferring it to another system.\n\n### potential risks of not meeting expectations/items to be checked:\n\n- there is a risk of losing access and readability of archived data due to software updates or equipment changes. verify that the company has access to archived data.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "93619378-84b1-48ca-b38a-d682d4860faf": {"__data__": {"id_": "93619378-84b1-48ca-b38a-d682d4860faf", "embedding": null, "metadata": {"page_label": "52", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Management and Control of Hybrid Systems.", "questions_this_excerpt_can_answer": "1. What are the recommended controls for managing hybrid systems in regulated environments according to the PI 041-1 Good Practices for Data Management and Integrity document from 2021?\n \n2. 
Why does the PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021) discourage the use of hybrid systems, and what are the suggested actions for existing hybrid systems?\n\n3. How does the PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021) document suggest qualifying and controlling each element of a hybrid system?", "prev_section_summary": "The section discusses the expectations and procedures for data backup and archiving in regulated environments. Key topics include storing backup copies in remote locations, ensuring data readability throughout the retention period, capturing all data and metadata, retaining metadata for future queries, securing archive copies in separate locations, maintaining data integrity, and having procedures for restoring archived data. Entities mentioned include data storage systems, metadata, data accessibility, environmental controls, restricted access, and system retirement procedures.", "excerpt_keywords": "Data Management, Data Integrity, Regulated Environments, Hybrid Systems, Control"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n#### management of hybrid systems\n\n|item:|management of hybrid systems|\n|---|---|\n|1.|hybrid systems require specific and additional controls in reflection of their complexity and potential increased vulnerability to manipulation of data. for this reason, the use of hybrid systems is discouraged and such systems should be replaced whenever possible. 
each element of the hybrid system should be qualified and controlled in accordance with the guidance relating to manual and computerised systems as specified above.|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "bbe413ba-4f7d-40cc-892d-267beb000531": {"__data__": {"id_": "bbe413ba-4f7d-40cc-892d-267beb000531", "embedding": null, "metadata": {"page_label": "53", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Comprehensive Title: Quality Risk Management Principles for Hybrid Systems in Data Management: A Guide for Effective Risk Mitigation and Compliance", "questions_this_excerpt_can_answer": "1. What specific procedures should be implemented to manage the interface between manual and automated systems within a hybrid system to ensure data integrity and management?\n \n2. How should a detailed system description for a hybrid system be structured to effectively outline its components, their functions, and the controls for data management and integrity?\n\n3. 
What are the key elements that need to be verified by inspectors to ensure that hybrid systems meet regulatory expectations for data management and integrity, particularly in relation to manual and computerized system interfaces?", "prev_section_summary": "The section discusses the management of hybrid systems in regulated environments, highlighting the need for specific and additional controls due to their complexity and potential vulnerability to data manipulation. The document discourages the use of hybrid systems and recommends replacing them whenever possible. It suggests qualifying and controlling each element of the hybrid system in accordance with guidance for manual and computerized systems.", "excerpt_keywords": "Quality Risk Management, Hybrid Systems, Data Management, Data Integrity, Regulatory Expectations"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\nappropriate quality risk management principles should be followed when assessing, defining, and demonstrating the effectiveness of control measures applied to the system.\n\na detailed system description of the entire system should be available that outlines all major components of the system, the function of each component, controls for data management and integrity, and the manner in which system components interact.\n\nprocedures and records should be available to manage and appropriately control the interface between manual and automated systems, particularly steps associated with:\n\n- manual input of manually generated data into computerized systems;\n- transcription (including manual) of data generated by automated systems onto paper records; and\n- automated detection and transcription of printed data into computerized systems.\n\npotential risk of not meeting expectations/items to be checked:\n\n- check that hybrid systems are clearly 
defined and identified, and that each contributing element of the system is validated.\n- attention should be paid to the interface between the manual and computerized system. inspectors should verify that adequate controls and secondary checks are in place where manual transcription between systems takes place.\n- original data should be retained following transcription and processing.\n- hybrid systems commonly consist of a combination of computerized and manual systems. particular attention should be paid to verifying:\n- the extent of qualification and/or validation of the computerized system; and,\n- the robustness of controls applied to the management of the manual element of the hybrid system due to the difficulties in consistent application of a manual process.\n\nprocedures should be in place to manage the review of data generated by hybrid systems which clearly outline the process for the evaluation and approval of electronic and paper-based data. procedures should outline:\n\n- instructions for how electronic data and paper-based data is correlated to form a complete record.\n- expectations for approval of data outputs for each system.\n- risks identified with hybrid systems, with a focus on verification of the effective application of controls.\n\npotential risk of not meeting expectations/items to be checked:\n\n- verify that instructions for the review of hybrid system data is in place.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "37a5e373-9797-4790-8c69-9edfea623658": {"__data__": {"id_": "37a5e373-9797-4790-8c69-9edfea623658", "embedding": null, "metadata": {"page_label": "54", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & 
Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Ensuring Data Integrity in Outsourced Supply Chains: Key Considerations and Best Practices", "questions_this_excerpt_can_answer": "1. What are the key components that modern supply chains in the pharmaceutical industry typically involve, as outlined in the PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021) document?\n\n2. According to the PI 041-1 guidance document, what are the recommended strategies for assessing data integrity within outsourced supply chains in the pharmaceutical sector?\n\n3. How does the PI 041-1 document suggest organizations should handle the limitations of data integrity when dealing with information obtained from the supply chain, such as summary records and copies/printouts?", "prev_section_summary": "The section discusses the importance of quality risk management principles in assessing and controlling hybrid systems for data management and integrity in regulated environments. It emphasizes the need for a detailed system description outlining components, functions, and controls, as well as procedures for managing the interface between manual and automated systems. Key elements to be verified by inspectors include system definition, validation, controls for manual-computerized system interface, retention of original data, qualification/validation of computerized systems, and review/approval processes for electronic and paper-based data. 
The potential risks of not meeting expectations are highlighted throughout the excerpt.", "excerpt_keywords": "Data Integrity, Outsourced Supply Chains, Pharmaceutical Industry, Risk Management, Data Governance"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n## data integrity considerations for outsourced activities\n\n|10|data integrity considerations for outsourced activities|\n|---|---|\n|10.1|general supply chain considerations|\n|10.1.1|modern supply chains often consist of multiple partner companies working together to ensure safe and continued supply of medicinal products. typical supply chains require the involvement of api producers, dosage form manufacturers, analytical laboratories, wholesale and distribution organisations, often from differing organisations and locations. these supply chains are often supported by additional organisations, providing outsourced services, it services and infrastructure, expertise or consulting services.|\n|10.1.2|data integrity plays a key part in ensuring the security and integrity of supply chains. data governance measures by a contract giver may be significantly weakened by unreliable or falsified data or materials provided by supply chain partners. this principle applies to all outsourced activities, including suppliers of raw materials, contract manufacturers, analytical services, wholesalers, contracted service providers and consultants.|\n|10.1.3|initial and periodic re-qualification of supply chain partners and outsourced activities should include consideration of data integrity risks and appropriate control measures.|\n|10.1.4|it is important for an organisation to understand the data integrity limitations of information obtained from the supply chain (e.g. summary records and copies / printouts) and the challenges of remote supervision. 
these limitations are similar to those discussed in section 8.11 of this guidance. this will help to focus resources towards data integrity verification and supervision using a quality risk management approach.|\n|10.2|routine document verification|\n|10.2.1|the supply chain relies upon the use of documentation and data passed from one organisation to another. it is often not practical for the contract giver to review all raw data relating to reported results. emphasis should be placed upon a robust qualification process for outsourced suppliers and contractors, using quality risk management principles.|\n|10.3|strategies for assessing data integrity in the supply chain|\n|10.3.1|companies should conduct regular risk reviews of supply chains and outsourced activity that evaluate the extent of data integrity controls required. the frequency of such reviews should be based on the criticality of the services provided by the contract acceptor, using risk management principles. information considered during risk reviews may include:|\n\n- the outcome of site audits, with focus on data governance measures\n- demonstrated compliance with international standards or guidelines related to data integrity and security\n- review of data submitted in routine reports, for example:\n\npi 041-1 54 of 63 1 july 2021", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "02c4bc73-9842-49a3-861b-da9446747a81": {"__data__": {"id_": "02c4bc73-9842-49a3-861b-da9446747a81", "embedding": null, "metadata": {"page_label": "55", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments 
(2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "\"Ensuring Data Integrity in Supply Chain Relationships: The Role of Quality Agreements and Audits\"", "questions_this_excerpt_can_answer": "1. What specific measures are recommended in PI 041-1 for ensuring data integrity in the relationship between manufacturers and suppliers, particularly in the context of quality agreements?\n \n2. How does PI 041-1 suggest manufacturers should conduct audits of suppliers and contract manufacturers to verify data integrity measures, and what expectations are set for contract acceptors during these audits?\n\n3. According to PI 041-1, what are some of the recommended practices for the quality unit of the contract giver in verifying source electronic data and metadata, and how does it suggest incorporating a quality risk management approach in routine surveillance and audits?", "prev_section_summary": "This section discusses data integrity considerations for outsourced activities in the pharmaceutical industry supply chain. Key topics include the involvement of multiple partner companies, the importance of data integrity in ensuring supply chain security, the need for initial and periodic re-qualification of supply chain partners, the limitations of data integrity in information obtained from the supply chain, routine document verification, and strategies for assessing data integrity in the supply chain. 
Entities mentioned include API producers, dosage form manufacturers, analytical laboratories, wholesale and distribution organizations, outsourced service providers, and consultants.", "excerpt_keywords": "Data integrity, Quality agreements, Audits, Supply chain, Contract manufacturing"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n|area for review|rationale|\n|---|---|\n|comparison of analytical data reported by the contractor or supplier vs in-house data from analysis of the same material|to look for discrepant data which may be an indicator of falsification|\n\n## 10.3.2 quality agreements (or equivalent)\n\nshould be in place between manufacturers and suppliers of materials, service providers, contract manufacturing organisations (cmos) and (in the case of distribution) suppliers of medicinal products, with specific provisions for ensuring data integrity across the supply chain. this may be achieved by setting out expectations for data governance, and transparent error/deviation reporting by the contract acceptor to the contract giver. there should also be a requirement to notify the contract giver of any data integrity failures identified at the contract acceptor site.\n\n## 10.3.3 audits of suppliers and manufacturers of apis, critical intermediate suppliers, primary and printed packaging materials suppliers, contract manufacturers and service providers\n\nconducted by the manufacturer (or by a third party on their behalf) should include a verification of data integrity measures at the contract organisation. 
contract acceptors are expected to provide reasonable access to data generated on behalf of the contract giver during audits, so that compliance with data integrity and management principles can be assessed and demonstrated.\n\n## 10.3.4 audits and routine surveillance\n\nshould include adequate verification of the source electronic data and metadata by the quality unit of the contract giver using a quality risk management approach. this may be achieved by site audit measures such as:\n\n- review of the contract acceptor's organisational behaviour, and understanding of data governance, data lifecycle, risk and criticality.\n- material testing vs coa: compare the results of analytical testing vs the supplier's reported coa. examine discrepancies in accuracy, precision or purity results. this may be performed on a routine basis, periodically, or unannounced, depending on material and supplier risks. periodic proficiency testing of samples may be considered where relevant.\n- remote data review: the contract giver may consider offering the contracted facility/supplier use of their own hardware and software system (deployed over a wide area network) to use in batch manufacture and testing. the contract giver may monitor the quality and integrity of the data generated by the contracted facility personnel in real time. 
in this situation, there should be segregation of duties to ensure that contract giver monitoring of data does not give provision for amendment of data generated by the contract acceptor.\n\npi 041-1 55 of 63 1 july 2021", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "9a9d78ea-90ec-49e1-b40d-ead014424cbf": {"__data__": {"id_": "9a9d78ea-90ec-49e1-b40d-ead014424cbf", "embedding": null, "metadata": {"page_label": "56", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Ensuring Data Integrity and Good Manufacturing Practices in the Pharmaceutical Industry", "questions_this_excerpt_can_answer": "1. How does the document suggest handling client-confidential information in the context of contract acceptors and givers to maintain data integrity without breaking confidentiality obligations?\n \n2. What specific PIC/S guides and sections are referenced in the document for ensuring the authenticity and accuracy of supplied documentation in the pharmaceutical industry, particularly concerning data integrity and traceability risks?\n\n3. 
What are the ALCOA principles mentioned in the document, and how do they correspond to specific sections in the PIC/S Guide to Good Manufacturing Practice for Medicinal Products and the PIC/S Guide to Good Distribution Practice for Medicinal Products?", "prev_section_summary": "The section discusses the importance of data integrity in supply chain relationships, particularly between manufacturers and suppliers. It emphasizes the need for quality agreements with specific provisions for data governance and error reporting. The section also highlights the importance of audits of suppliers and manufacturers to verify data integrity measures, with expectations set for contract acceptors during these audits. Additionally, it recommends practices for the quality unit of the contract giver to verify electronic data and metadata, incorporating a quality risk management approach in routine surveillance and audits. Key topics include comparison of analytical data, quality agreements, audits of suppliers and manufacturers, and routine surveillance measures such as material testing, remote data review, and segregation of duties. Key entities mentioned are manufacturers, suppliers, contract manufacturing organisations (CMOs), contract acceptors, and the quality unit of the contract giver.", "excerpt_keywords": "Data integrity, Good Manufacturing Practices, Pharmaceutical Industry, PIC/S guides, ALCOA principles"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n## quality monitoring\n\nquality and performance monitoring may indicate incentive for data falsification (e.g. raw materials which marginally comply with specification on a frequent basis).\n\n## 10.3.5\n\ncontract givers may work with the contract acceptor to ensure that all client-confidential information is encoded to de-identify clients. 
this would facilitate review of source electronic data and metadata at the contract givers site, without breaking confidentiality obligations to other clients. by reviewing a larger data set, this enables a more robust assessment of the contract acceptors data governance measures. it also permits a search for indicators of data integrity failure, such as repeated data sets or data which does not demonstrate the expected variability.\n\n## 10.3.6\n\ncare should be taken to ensure the authenticity and accuracy of supplied documentation (refer section 8.11). the difference in data integrity and traceability risks between true copy and summary report data should be considered when making contractor and supply chain qualification decisions.\n\n## 11 regulatory actions in response to data integrity findings\n\n## 11.1 deficiency references\n\nthe integrity of data is fundamental to good manufacturing practice and the requirements for good data management are embedded in the current pic/s guides to gmp/gdp for medicinal products. 
the following table provides a reference point highlighting some of these existing requirements.\n\n|alcoa principle|pic/s guide to good manufacturing practice for medicinal products, pe 009 (part i):|pic/s guide to good manufacturing practice for medicinal products, pe 009 (part ii):|pic/s guide (computerised systems) annex 11:|pic/s guide to good distribution practice for medicinal products, pe 011:|\n|---|---|---|---|---|\n|attributable|[4.20, c & f], [4.21, c & i], [4.29 point 5]|[5.43], [6.14], [6.18], [6.52]|[2], [12.1], [12.4], [15]|[4.2.4], [4.2.5]|\n|legible|[4.1], [4.2], [4.7], [4.8], [4.9], [4.10]|[6.11], [6.14], [6.15], [6.50]|[4.2.3], [4.2.9]|[4.8], [7.1], [7.2], [8.1], [9], [10], [17]|\n|contemporaneous|[4.8]|[6.14]|[12.4], [14]|[4.1], [4.2.9]|\n|original|[4.9], [4.27], [paragraph \"record\"]|[6.14], [6.15], [6.16]|[8.2], [9]|[4.2.5]|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "22b4ec3b-e384-4251-8691-8b2093a39ad6": {"__data__": {"id_": "22b4ec3b-e384-4251-8691-8b2093a39ad6", "embedding": null, "metadata": {"page_label": "57", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Data Integrity Deficiencies and Their Impact on Product Quality: A Comprehensive Classification and Analysis", "questions_this_excerpt_can_answer": "1. How does the PIC/S guidance define a critical deficiency in the context of data integrity and product quality within regulated environments?\n \n2. 
What are the implications of data integrity deficiencies according to the PIC/S guidance, specifically in terms of their impact on product quality and the classification of such deficiencies?\n\n3. Can you detail the types of situations or practices that may lead to a classification of data integrity deficiencies as critical, especially in relation to fraud, misrepresentation, or falsification within the pharmaceutical industry as outlined in the document?", "prev_section_summary": "The section discusses the importance of quality monitoring in detecting data falsification in regulated environments. It also addresses the handling of client-confidential information to maintain data integrity without breaching confidentiality obligations. The document references specific PIC/S guides and sections for ensuring the authenticity and accuracy of supplied documentation in the pharmaceutical industry. Additionally, it outlines the ALCOA principles and their correspondence to sections in the PIC/S Guide to Good Manufacturing Practice for Medicinal Products and the PIC/S Guide to Good Distribution Practice for Medicinal Products. 
The section emphasizes the fundamental role of data integrity in good manufacturing practice and regulatory actions in response to data integrity findings.", "excerpt_keywords": "Data Integrity, Product Quality, PIC/S Guidance, Regulatory Environments, Pharmaceutical Industry"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n|accurate|[4.1], [6.17]|[5.40], [5.42]|[paragraph \"principles\"]|[4.2.3]|\n|---|---|---|---|---|\n| | |[5.45], [5.46]| | |\n| | |[5.47], [6.6]|[4.8], [5], [6], [7.2], [10], [11]| |\n|complete|[4.8]|[6.16], [6.50]|[4.8], [7.1]|[4.2.3]|\n| | |[6.60], [6.61]|[7.2], [9]|[4.2.5]|\n|consistent|[4.2]|[6.15], [6.50]|[4.8], [5]|[4.2.3]|\n|enduring|[4.1], [4.10]|[6.11], [6.12], [6.14]|[7.1], [17]|[4.2.6]|\n|available|[paragraph \"principle\"]|[6.12], [6.15], [6.16]|[3.4], [7.1], [16], [17]|[4.2.1]|\n| |[4.1]| | | |\n\n11.2 classification of deficiencies\n\nnote: the following guidance is intended to aid consistency in reporting and classification of data integrity deficiencies, and is not intended to affect the inspecting authoritys ability to act according to its internal policies or national regulatory frameworks.\n\n11.2.1 deficiencies relating to data integrity failure may have varying impact to product quality. prevalence of the failure may also vary between the actions of a single employee to an endemic failure throughout the inspected organisation.\n\n11.2.2 the pic/s guidance on classification of deficiencies states: \"a critical deficiency is a practice or process that has produced, or leads to a significant risk of producing either a product which is harmful to the human or veterinary patient or a product which could result in a harmful residue in a food producing animal. 
a critical deficiency also occurs when it is observed that the manufacturer has engaged in fraud, misrepresentation or falsification of products or data\".\n\n11.2.3 notwithstanding the \"critical\" classification of deficiencies relating to fraud, misrepresentation or falsification, it is understood that data integrity deficiencies can also relate to:\n\n- data integrity failure resulting from bad practice,\n- opportunity for failure (without evidence of actual failure) due to absence of the required data control measures.\n\n11.2.4 in these cases, it may be appropriate to assign classification of deficiencies by taking into account the following (indicative list only):\n\npi 040 pic/s guidance on classification of gmp deficiencies\n\npi 041-1 57 of 63 1 july 2021", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "d1e3e987-6db7-45d5-9f73-4a6acd9e7a09": {"__data__": {"id_": "d1e3e987-6db7-45d5-9f73-4a6acd9e7a09", "embedding": null, "metadata": {"page_label": "58", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Assessment of Data Integrity Deficiencies in Pharmaceutical Quality Systems: A Comprehensive Analysis", "questions_this_excerpt_can_answer": "1. 
What specific examples of data integrity deficiencies are classified as critical deficiencies due to their actual or potential risk to patient health within regulated pharmaceutical environments, as outlined in the PI 041-1 Good Practices for Data Management and Integrity document?\n\n2. How does the document differentiate between major deficiencies that have no risk to patient health and those that have no impact on product quality but evidence moderate failure in terms of data integrity within pharmaceutical quality systems?\n\n3. According to the PI 041-1 document, what are the key elements that need to be assessed to determine the adequacy of a pharmaceutical company's data governance process, and how do these elements contribute to the overall assessment of company-wide failure or deficiency of limited scope/impact in data management and integrity?", "prev_section_summary": "This section discusses the classification of data integrity deficiencies in regulated environments according to PIC/S guidance. It defines critical deficiencies as practices or processes that pose a significant risk to product quality, including those related to fraud, misrepresentation, or falsification. The section also addresses the varying impact of deficiencies on product quality and the importance of consistent reporting and classification. 
Key entities mentioned include the inspecting authority, manufacturer, and PIC/S guidance on classification of deficiencies.", "excerpt_keywords": "Data Integrity, Pharmaceutical Quality Systems, Critical Deficiency, Data Governance, Regulatory Action"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n## impact to product with actual or potential risk to patient health: critical deficiency:\n\nproduct failing to meet marketing authorisation specification at release or within shelf life.\nreporting of a desired result rather than an actual out of specification result when reporting of qc tests, critical product or process parameters.\nwide-ranging misrepresentation or falsification of data, with or without the knowledge and assistance of senior management, the extent of which critically undermines the reliability of the pharmaceutical quality system and erodes all confidence in the quality and safety of medicines manufactured or handled by the site.\n\n## impact to product with no risk to patient health: major deficiency:\n\ndata being misreported, e.g. original results in specification, but altered to give a more favourable trend.\nreporting of a desired result rather than an actual out of specification result when reporting of data which does not relate to qc tests, critical product or process parameters.\nfailures arising from poorly designed data capture systems (e.g. using scraps of paper to record info for later transcription).\n\n## no impact to product; evidence of moderate failure: major deficiency:\n\nbad practices and poorly designed systems which may result in opportunities for data integrity issues or loss of traceability across a limited number of functional areas (qa, production, qc etc.). 
each in its own right has no direct impact to product quality.\n\n## no impact to product; limited evidence of failure: other deficiency:\n\nbad practice or poorly designed system which results in opportunities for data integrity issues or loss of traceability in a discrete area.\nlimited failure in an otherwise acceptable system, e.g. manipulation of non-critical data by an individual.\n\nit is important to build an overall picture of the adequacy of the key elements (data governance process, design of systems to facilitate compliant data recording, use and verification of audit trails and it user access etc.) to make a robust assessment as to whether there is a company-wide failure, or a deficiency of limited scope/impact.\n\nindividual circumstances (exacerbating / mitigating factors) may also affect final classification or regulatory action. further guidance on the classification of deficiencies and intra-authority reporting of compliance issues will be available in the pic/s guidance on the classification of deficiencies pi 040.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "1eb6334f-6f33-4513-834d-7d83a20655ea": {"__data__": {"id_": "1eb6334f-6f33-4513-834d-7d83a20655ea", "embedding": null, "metadata": {"page_label": "59", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Data Integrity Remediation Plan: Investigation, Root Cause Analysis, Risk Assessment, Corrective and Preventive 
Actions", "questions_this_excerpt_can_answer": "1. What specific steps should a company take as part of a remediation plan when responding to significant data integrity issues, according to the guidelines provided in PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)?\n\n2. How does PI 041-1 recommend conducting a comprehensive investigation into data integrity failures, including the scope of the investigation and the methodology for identifying root causes and assessing risks?\n\n3. What are the guidelines for implementing corrective and preventive actions to address data integrity vulnerabilities as outlined in PI 041-1, and what considerations should be made for multinational companies or those operating across multiple sites?", "prev_section_summary": "The section discusses the classification of data integrity deficiencies in pharmaceutical quality systems, distinguishing between critical deficiencies with a risk to patient health, major deficiencies with no risk to patient health, and moderate or limited failures. It highlights examples of each type of deficiency, such as misreporting of data, falsification of results, and poorly designed data capture systems. The importance of assessing key elements like data governance processes, system design, audit trails, and user access is emphasized to determine the overall scope and impact of deficiencies within a pharmaceutical company. 
Individual circumstances and regulatory actions are also mentioned as factors that may influence the classification of deficiencies.", "excerpt_keywords": "Data Integrity, Remediation Plan, Investigation, Root Cause Analysis, Risk Assessment"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n## remediation of data integrity failures\n\n|12|remediation of data integrity failures|\n|---|---|\n|12.1|responding to significant data integrity issues|\n|12.1.1|consideration should be primarily given to resolving the immediate issues identified and assessing the risks associated with the data integrity issues. the response by the company in question should outline the actions taken as part of a remediation plan. responses from implicated manufacturers should include:|\n|12.1.1.1|a comprehensive investigation into the extent of the inaccuracies in data records and reporting, to include:|\n\n- a detailed investigation protocol and methodology; a summary of all laboratories, manufacturing operations, products and systems to be covered by the assessment; and a justification for any part of the operation that the regulated user proposes to exclude\n- interviews of current and where possible and appropriate, former employees to identify the nature, scope, and root cause of data inaccuracies. these interviews may be conducted by a qualified third party;\n- an assessment of the extent of data integrity deficiencies at the facility. 
identify omissions, alterations, deletions, record destruction, non-contemporaneous record completion, and other deficiencies;\n- determination of the scope (data, products, processes and specific batches) and timeframe for the incident, with justification for the time-boundaries applied;\n- a description of all parts of the operations in which data integrity lapses occurred, additional consideration should be given to global corrective actions for multinational companies or those that operate across multiple sites;\n- a comprehensive retrospective evaluation of the nature of the data integrity deficiencies, and the identification of root cause(s) or most likely root cause that will form the basis of corrective and preventative actions, as defined in the investigation protocol. the services of a qualified third-party consultant with specific expertise in the areas where potential breaches were identified may be required;\n- a risk assessment of the potential effects of the observed failures on the quality of the substances, medicines, and products involved. 
the assessment should include analyses of the potential risks to patients caused by the release/distribution of products affected by a lapse of data integrity, risks posed by ongoing operations, and any impact on the integrity of data submitted to regulatory agencies, including data related to product registration dossiers.\n\n12.1.1.2\ncorrective and preventive actions taken to address the data integrity vulnerabilities and timeframe for implementation, and including:\nthe scope of the investigation should include an assessment of the extent of data integrity at the corporate level, including all facilities, sites and departments that could potentially be affected.\n\npi 041-1 59 of 63 1 july 2021", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "3437af41-eb15-4184-b435-44b5393665b9": {"__data__": {"id_": "3437af41-eb15-4184-b435-44b5393665b9", "embedding": null, "metadata": {"page_label": "60", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Measures for Addressing Data Integrity Issues in Pharmaceutical Companies: Strategies for Ensuring Accuracy and Compliance", "questions_this_excerpt_can_answer": "1. What specific interim measures are recommended for pharmaceutical companies to protect patients and ensure the quality of medicinal products when addressing data integrity issues?\n \n2. 
How does the document suggest pharmaceutical companies should demonstrate the effectiveness of their corrective and preventive actions (CAPA) in response to data integrity lapses?\n\n3. What are the key components that should be included in a management strategy submitted to the regulatory authority by a company implicated in data integrity issues, according to the document?", "prev_section_summary": "This section discusses the remediation of data integrity failures in regulated environments, outlining specific steps that should be taken as part of a remediation plan. Key topics include responding to significant data integrity issues, conducting a comprehensive investigation into data integrity failures, implementing corrective and preventive actions, and considerations for multinational companies or those operating across multiple sites. Entities mentioned include implicated manufacturers, current and former employees, qualified third-party consultants, and regulatory agencies. The section emphasizes the importance of identifying root causes of data inaccuracies, assessing risks, and taking appropriate actions to address data integrity vulnerabilities.", "excerpt_keywords": "Pharmaceutical companies, Data integrity, Remediation plan, Corrective actions, Regulatory authority"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n## interim measures\n\ndescribing the actions to protect patients and to ensure the quality of the medicinal products, such as notifying customers, recalling product, conducting additional testing, adding lots to the stability program to assure stability, drug application actions, and enhanced complaint monitoring. 
interim measures should be monitored for effectiveness and residual risks should be communicated to senior management, and kept under review.\n\n## long-term measures\n\ndescribing any remediation efforts and enhancements to procedures, processes, methods, controls, systems, management oversight, and human resources (e.g. training, staffing improvements) designed to ensure the data integrity. where long-term measures are identified, interim measures should be implemented to mitigate risks.\n\n## capa effectiveness checks\n\nimplemented to monitor if the actions taken have eliminated the issue.\n\n## meeting with implicated companies\n\nwhenever possible, inspectorates should meet with senior representatives from the implicated companies to convey the nature of the deficiencies identified and seek written confirmation that the company commits to a comprehensive investigation and a full disclosure of issues and their prompt resolution. a management strategy should be submitted to the regulatory authority that includes the details of the global corrective action and preventive action plan. the strategy should include:\n\n- a comprehensive description of the root causes of the data integrity lapses, including evidence that the scope and depth of the current action plan is commensurate with the findings of the investigation and risk assessment. 
this should indicate if individuals responsible for data integrity lapses remain able to influence gmp/gdp-related or drug application data.\n- a detailed corrective action plan that describes how the regulated user intends to ensure the alcoa+ attributes (see section 7.4) of all of the data generated, including analytical data, manufacturing records, and all data submitted or presented to the competent authority.\n\n## management of significant data integrity issues\n\ninspectorates should implement policies for the management of significant data integrity issues identified at inspection in order to manage and contain risks associated with the data integrity breach.\n\n## indicators of improvement\n\nan on-site inspection is recommended to verify the effectiveness of actions taken to address serious data integrity issues. alternative approaches to verify effective remediation may be considered in accordance with risk management principles. some indicators of improvement are:\n\n- evidence of a thorough and open evaluation of the identified issue and timely implementation of effective corrective and preventive actions, including appropriate implementation of corrective and preventive actions at an organizational level.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "4ee1ebd4-dc9a-413a-9a8e-232958588151": {"__data__": {"id_": "4ee1ebd4-dc9a-413a-9a8e-232958588151", "embedding": null, "metadata": {"page_label": "61", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, 
"creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Ensuring Data Integrity and Effective Communication in Regulatory Compliance", "questions_this_excerpt_can_answer": "1. What specific practices are recommended for maintaining transparency and open communication with clients and regulators during the investigation and remediation stages of data integrity issues in regulated environments?\n\n2. How does the document suggest a regulated user should evaluate the vulnerability of electronic systems to data manipulation, and what external resources might they need to ensure comprehensive resolution of violations?\n\n3. Can you detail the definitions provided for \"audit trail,\" \"data flow map,\" and \"data governance\" within the context of Good Practices for Data Management and Integrity in Regulated Environments, as outlined in the document?", "prev_section_summary": "The section discusses interim and long-term measures for addressing data integrity issues in pharmaceutical companies, including actions to protect patients and ensure product quality, remediation efforts, and enhancements to procedures. It also covers the effectiveness checks of corrective and preventive actions (CAPA), meetings with implicated companies, management of significant data integrity issues, and indicators of improvement. Key entities mentioned include inspectorates, senior representatives from implicated companies, regulatory authorities, and regulated users.", "excerpt_keywords": "Data Integrity, Regulatory Compliance, Communication, Electronic Systems, Audit Trail"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n## 12.2.1.2 evidence of open communication of issues with clients and other regulators.\n\ntransparent communication should be maintained throughout the investigation and remediation stages. 
regulators should be aware that further data integrity failures may be reported as a result of the detailed investigation. any additional reaction to these notifications should be proportionate to public health risks, to encourage continued reporting;\n\n## 12.2.1.3 evidence of communication of data integrity expectations across the organisation, incorporating and encouraging processes for open reporting of potential issues and opportunities for improvement;\n\n## 12.2.1.4 the regulated user should ensure that an appropriate evaluation of the vulnerability of electronic systems to data manipulation takes place to ensure that follow-up actions have fully resolved all the violations. for this evaluation the services of qualified third party consultant with the relevant expertise may be required;\n\n## 12.2.1.5 implementation of data integrity policies in line with the principles of this guide;\n\n## 12.2.1.6 implementation of routine data verification practices.\n\n## 13 glossary\n\n|archiving|long term, permanent retention of completed data and relevant metadata in its final form for the purposes of reconstruction of the process or activity.|\n|---|---|\n|audit trail|gmp/gdp audit trails are metadata that are a record of gmp/gdp critical information (for example the creation, modification, or deletion of gmp/gdp relevant data), which permit the reconstruction of gmp/gdp activities.|\n|back-up|a copy of current (editable) data, metadata and system configuration settings (e.g. 
variable settings which relate to an analytical run) maintained for the purpose of disaster recovery.|\n|computerised system|a system including the input of data, electronic processing and the output of information to be used either for reporting or automatic control.|\n|data|facts, figures and statistics collected together for reference or analysis.|\n|data flow map|a graphical representation of the \"flow\" of data through an information system.|\n|data governance|the sum total of arrangements to ensure that data, irrespective of the format in which it is generated, recorded, processed, retained and used to ensure a complete, consistent and accurate record throughout the data lifecycle.|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "dfb3836c-c403-4286-b25f-d9d3ddd6283d": {"__data__": {"id_": "dfb3836c-c403-4286-b25f-d9d3ddd6283d", "embedding": null, "metadata": {"page_label": "62", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "\"Ensuring Data Integrity and Quality in the Data Lifecycle: Key Concepts and Practices\"", "questions_this_excerpt_can_answer": "1. What are the key characteristics that define data integrity according to the document \"Ensuring Data Integrity and Quality in the Data Lifecycle: Key Concepts and Practices\"?\n\n2. How does the document describe the concept of a \"dynamic record\" and its significance in the context of data management and integrity?\n\n3. 
What specific practices does the document recommend for ensuring that documentation, whether paper or electronic, meets data management and integrity principles under the section titled \"good documentation practices (gdocp)\"?", "prev_section_summary": "This section focuses on the importance of open communication with clients and regulators during data integrity issues in regulated environments. It emphasizes the need for transparent communication, evaluation of electronic systems' vulnerability to data manipulation, implementation of data integrity policies, and routine data verification practices. The glossary provides definitions for key terms such as archiving, audit trail, backup, computerized system, data flow map, and data governance, which are essential for understanding and implementing good practices for data management and integrity in regulated environments.", "excerpt_keywords": "data integrity, data lifecycle, data quality, dynamic record, good documentation practices"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\n## data integrity\n\nthe degree to which data are complete, consistent, accurate, trustworthy, reliable and that these characteristics of the data are maintained throughout the data life cycle. the data should be collected and maintained in a secure manner, so that they are attributable, legible, contemporaneously recorded, original (or a true copy) and accurate. assuring data integrity requires appropriate quality and risk management systems, including adherence to sound scientific principles and good documentation practices. 
the data should comply with alcoa+ principles.\n\n## data lifecycle\n\nall phases in the life of the data (including raw data) from initial generation and recording through processing (including transformation or migration), use, data retention, archive / retrieval and destruction.\n\n## data quality\n\nthe assurance that data produced is exactly what was intended to be produced and fit for its intended purpose. this incorporates alcoa+ principles.\n\n## data ownership\n\nthe allocation of responsibilities for control of data to a specific process owner. companies should implement systems to ensure that responsibilities for systems and their data are appropriately allocated and responsibilities undertaken.\n\n## dynamic record\n\nrecords, such as electronic records, that allow an interactive relationship between the user and the record content.\n\n## exception report\n\na validated search tool that identifies and documents predetermined abnormal data or actions, which require further attention or investigation by the data reviewer.\n\n## good documentation practices (gdocp)\n\nthose measures that collectively and individually ensure documentation, whether paper or electronic, meets data management and integrity principles, e.g. alcoa+.\n\n## hybrid systems\n\na system for the management and control of data that typically consists of an electronic system generating electronic data, supplemented by a defined manual system that typically generates a paper-based record. the complete data set from a hybrid system therefore consists of both electronic and paper data together. 
hybrid systems rely on the effective management of both sub-systems for correct operation.\n\n## master document\n\nan original approved document from which controlled copies for distribution or use can be made.\n\n14 gxp data integrity guidance and definitions, mhra, march 2018\n\npi 041-1 62 of 63 1 july 2021", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "da99a6a3-c1d6-4349-ba2a-30679a9458ef": {"__data__": {"id_": "da99a6a3-c1d6-4349-ba2a-30679a9458ef", "embedding": null, "metadata": {"page_label": "63", "file_name": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf", "file_type": "application/pdf", "file_size": 975750, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Data Management and Quality Control in the Pharmaceutical Industry: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. What is the role of metadata in ensuring data integrity within regulated environments, as outlined in the \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\" document?\n\n2. How does the document \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\" define the responsibilities of the Quality Unit within a pharmaceutical company?\n\n3. 
According to the \"PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021)\" document, what distinguishes raw data from static records in the context of data management and quality control in the pharmaceutical industry?", "prev_section_summary": "The section discusses key concepts related to data integrity, data lifecycle, data quality, data ownership, dynamic records, exception reports, good documentation practices, hybrid systems, and master documents. It defines data integrity as the completeness, consistency, accuracy, trustworthiness, and reliability of data throughout its lifecycle. It emphasizes the importance of maintaining data in a secure and compliant manner, adhering to quality and risk management systems, and following sound scientific principles and good documentation practices. The section also highlights the significance of dynamic records, exception reports, and hybrid systems in ensuring data integrity and quality. Additionally, it mentions the allocation of responsibilities for data control to specific process owners and the use of master documents for distribution and use.", "excerpt_keywords": "metadata, quality unit, raw data, static record, supply chain"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[12] PI 041-1 Good Practices for Data Management and Integrity in Regulated Environments (2021).pdf\nmetadata\n\nin-file data that describes the attributes of other data, and provides context and meaning. typically, these are data that describe the structure, data elements, inter-relationships and other characteristics of data e.g. audit trails. metadata also permit data to be attributable to an individual (or if automatically generated, to the original data source). metadata form an integral part of the original record. 
without the context provided by metadata the data has no meaning.\n\nquality unit\n\nthe department within the regulated entity responsible for oversight of quality including in particular the design, effective implementation, monitoring and maintenance of the pharmaceutical quality system.\n\nraw data\n\nraw data is defined as the original record (data) which can be described as the first-capture of information, whether recorded on paper or electronically. information that is originally captured in a dynamic state should remain available in that state.\n\nstatic record\n\na record format, such as a paper or electronic record, that is fixed and allows little or no interaction between the user and the record content.\n\nsupply chain\n\nthe sum total of arrangements between manufacturing sites, wholesale and distribution sites that ensure that the quality of medicines is ensured throughout production and distribution to the point of sale or use.\n\nsystem administrator\n\na person who manages the operation of a computerised system or particular electronic communication service.\n\nrevision history\n\n|date|version number|reasons for revision|\n|---|---|---|\n|pi 041-1|63 of 63|1 july 2021|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "bc88fad5-d3e8-49e8-86de-5866b7df96c0": {"__data__": {"id_": "bc88fad5-d3e8-49e8-86de-5866b7df96c0", "embedding": null, "metadata": {"page_label": "1", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Navigating the Nexus of People, Process, and Technology: An 
Exploration of Techiriology", "questions_this_excerpt_can_answer": "Based on the provided context and the unique nature of the document titled \"Navigating the Nexus of People, Process, and Technology: An Exploration of Techiriology\" from the ISPE Records and Data Integrity Guide, here are three questions that this context can provide specific answers to, which are unlikely to be found elsewhere:\n\n1. **What is the definition of \"techiriology\" as explored within the pharmaceutical industry's context of records and data integrity?**\n - Given the unique term \"techiriology\" mentioned in the document title and its exploration within the nexus of people, process, and technology, this document is likely to offer a specialized definition or conceptual framework that is specific to the pharmaceutical industry's approach to records and data integrity. This definition or framework might not be readily available in general industry literature or standard dictionaries, making the document a primary source for understanding the term in this specific context.\n\n2. **How does the ISPE guide propose to integrate people, processes, and technology to enhance records and data integrity in pharmaceutical development and manufacturing?**\n - The document appears to provide a comprehensive exploration of integrating human elements, procedural aspects, and technological advancements to safeguard and enhance data integrity within the pharmaceutical sector. This question seeks to uncover the specific strategies, models, or best practices recommended by the ISPE guide that are tailored to the unique challenges and requirements of the pharmaceutical industry, which might not be covered in such a detailed or contextualized manner in more general industry guides or academic literature.\n\n3. 
**What are the unique challenges and solutions identified in the \"Navigating the Nexus of People, Process, and Technology: An Exploration of Techiriology\" document regarding ensuring data integrity in pharmaceutical operations?**\n - This question aims to extract specific insights or case studies presented in the document that highlight unique challenges faced by the pharmaceutical industry in maintaining data integrity across its operations. Furthermore, it seeks to understand the innovative solutions or methodologies proposed by the document that address these challenges, leveraging the interplay between people, processes, and technology. Given the document's specialized focus, the challenges and solutions discussed are likely to offer valuable perspectives that are not readily available in more generic discussions on data integrity in other sectors.\n\nThese questions leverage the higher-level summaries and the unique focus of the document to generate inquiries that are deeply rooted in the specialized content it promises to deliver, ensuring that the answers provided are both specific and contextually relevant to the pharmaceutical industry's data integrity concerns.", "excerpt_keywords": "ISPE, Records, Data Integrity, Pharmaceutical Industry, Techiriology"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n# people\n\n# process\n\n# \"techiriology", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "b32a5ca8-89f3-4f6a-a6bb-898036702a76": {"__data__": {"id_": "b32a5ca8-89f3-4f6a-a6bb-898036702a76", "embedding": null, "metadata": {"page_label": "2", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data 
Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Blank Canvas: A Collection of Unique Entities and Themes\"", "questions_this_excerpt_can_answer": "Based on the provided context, here are three questions that this specific context can provide specific answers to, which are unlikely to be found elsewhere:\n\n1. **What is the file size of the \"ISPE Records and Data Integrity Guide\" as stored in the PharmaWise Engineer project on Google Drive?**\n - This question is specific to the document's storage details within a particular project environment, which would not be commonly available information. The context directly provides the file size as 6245344 bytes.\n\n2. **What are the creation and last modification dates of the \"ISPE Records and Data Integrity Guide\" PDF within the PharmaWise CSV & Data Integrity project?**\n - The context uniquely specifies the creation and last modification dates of the document, which are 2024-04-07 and 2024-04-04, respectively. This information is specific to the document's version stored in the mentioned project and would not be easily found elsewhere.\n\n3. **Under what title is the \"ISPE Records and Data Integrity Guide\" listed in the PharmaWise Engineer project's raw data directory?**\n - The context provides a unique document title, \"Blank Canvas: A Collection of Unique Entities and Themes,\" which seems to be an unusual title for a guide related to records and data integrity. 
This question targets the specific naming or cataloging convention used within the project, which is unlikely to be replicated or relevant in other contexts.\n\nThese questions are tailored to the unique identifiers and metadata provided in the context, focusing on the document's storage, versioning, and cataloging specifics within a project environment.", "prev_section_summary": "The key topics of the section include the exploration of \"techiriology\" within the pharmaceutical industry's context of records and data integrity, the integration of people, processes, and technology to enhance data integrity in pharmaceutical development and manufacturing, and the unique challenges and solutions identified in ensuring data integrity in pharmaceutical operations. The section focuses on the interplay between people, processes, and technology in addressing data integrity concerns specific to the pharmaceutical industry.", "excerpt_keywords": "ISPE, Records, Data Integrity, Guide, PharmaWise Engineer"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\nno_content_here", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "5940533d-c963-454d-8d56-7c03b5edb2f8": {"__data__": {"id_": "5940533d-c963-454d-8d56-7c03b5edb2f8", "embedding": null, "metadata": {"page_label": "3", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "ISPE GAMP Records and Data Integrity Guide: Disclaimer, Limitation of Liability, and 
Copyright Information", "questions_this_excerpt_can_answer": "1. What is the primary purpose of the ISPE GAMP Records and Data Integrity Guide, and how does ISPE position its effectiveness in ensuring regulatory compliance?\n \n2. What are the limitations of liability outlined by ISPE for users of the GAMP Records and Data Integrity Guide, and what types of damages are specifically mentioned as not being covered?\n\n3. What are the copyright restrictions associated with the ISPE GAMP Records and Data Integrity Guide, and what forms of reproduction or copying are explicitly prohibited?", "prev_section_summary": "The key topics and entities of this section focus on the specific details related to the document \"ISPE Records and Data Integrity Guide\" stored in the PharmaWise Engineer project on Google Drive. These details include the file size, creation and last modification dates, and the unique document title within the project's raw data directory. The section highlights the importance of understanding the storage, versioning, and cataloging specifics of documents within a project environment for data integrity and management purposes.", "excerpt_keywords": "ISPE, GAMP, Records, Data Integrity, Guide"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\nispe, the developers of gamp(r) records and data integrity guide\n\ndisclaimer:\n\nthis guide is intended to assist regulated companies in managing records and data throughout the data life cycle. ispe cannot ensure and does not warrant that a system managed in accordance with this guide will be acceptable to regulatory authorities. 
further, this guide does not replace the need for hiring professional engineers or technicians.\n\nlimitation of liability\n\nin no event shall ispe or any of its affiliates, or the officers, directors, employees, members, or agents of each of them, or the authors, be liable for any damages of any kind, including without limitation any special, incidental, indirect, or consequential damages, whether or not advised of the possibility of such damages, and on any theory of liability whatsoever, arising out of or in connection with the use of this information.\n\n(c) copyright ispe 2017. all rights reserved.\n\nall rights reserved. no part of this document may be reproduced or copied in any form or by any means - graphic, electronic, or mechanical, including photocopying, taping, or information storage and retrieval systems - without written permission of ispe.\n\nall trademarks used are acknowledged.\n\nisbn 978-1-936379-96-5", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "cfa567c0-1d26-432e-9dae-19bb11ce53e4": {"__data__": {"id_": "cfa567c0-1d26-432e-9dae-19bb11ce53e4", "embedding": null, "metadata": {"page_label": "4", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Data Integrity in Life Sciences: A Guide to Compliance, Patient Safety, and Ensuring Data Integrity\"", "questions_this_excerpt_can_answer": "1. What is the relationship between data integrity and patient safety as outlined in the ISPE GAMP\u00ae Guide: Records and Data Integrity?\n \n2. 
How does the ISPE Records and Data Integrity Guide propose to integrate the management of records and data into the quality management system to comply with GxP requirements?\n\n3. According to the document, what are the consequences faced by companies that fail to maintain acceptable data integrity practices as highlighted by regulators and health agencies?", "prev_section_summary": "The section discusses the ISPE GAMP Records and Data Integrity Guide, including a disclaimer stating that the guide is intended to assist regulated companies in managing records and data but does not guarantee regulatory compliance. It also outlines limitations of liability for ISPE and its affiliates, stating they are not liable for any damages arising from the use of the information in the guide. Additionally, the section mentions copyright restrictions, prohibiting reproduction or copying of the document without written permission from ISPE.", "excerpt_keywords": "ISPE, GAMP Guide, Data Integrity, Patient Safety, Quality Management System"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\nispe gamp(r) guide: records and data integrity\n\npreface\n\nthe importance of data integrity is reflected in recent guidance, citations, and public comments of regulators and health agencies. a number of companies have suffered serious regulatory and financial consequences as a result of unacceptable data integrity practices.\n\npatient safety is affected by the integrity of critical records, data, and decisions, as well as those aspects concerned with physical attributes of the product. 
that the phrase \"patient safety, product quality, and data integrity\" is commonly used in regulatory and industry guidance underlines this point.\n\nthe use of information technology and computerized systems in all aspects of life sciences continues to grow and has resulted in the generation of more data to support the development and manufacture of products. key decisions and actions are routinely being made based on this data, and the integrity of the data, whether in electronic or paper form, is of paramount importance to the industry, the regulatory agencies, and ultimately the patient.\n\nindustry will benefit from clear guidance on ensuring that the management of records and data forms an integral part of the quality management system, and is compliant with gxp requirements. this guide intends to provide such guidance and is aligned with ispe gamp(r) 5: a risk-based approach to compliant gxp computerized systems.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "66114eed-9081-4807-b1dc-248f2406c876": {"__data__": {"id_": "66114eed-9081-4807-b1dc-248f2406c876", "embedding": null, "metadata": {"page_label": "5", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "ISPE GAMP(R) Records and Data Integrity Guide: Acknowledgements and Contributions", "questions_this_excerpt_can_answer": "1. Who led the task team responsible for producing the ISPE GAMP(R) Records and Data Integrity Guide, and what were their respective organizations and countries?\n \n2. 
Can you list the core team members who took lead roles in the preparation of the ISPE GAMP(R) Records and Data Integrity Guide, including their affiliations and countries?\n\n3. Which regulatory bodies and subject matter experts provided review and comments on the ISPE GAMP(R) Records and Data Integrity Guide, and what are their affiliations?", "prev_section_summary": "The section discusses the importance of data integrity in the life sciences industry, particularly in relation to patient safety, product quality, and regulatory compliance. It highlights the consequences faced by companies that fail to maintain acceptable data integrity practices and emphasizes the need for clear guidance on integrating the management of records and data into the quality management system to comply with GxP requirements. The section also mentions the use of information technology and computerized systems in generating data to support product development and manufacturing, and the importance of data integrity in making key decisions and actions. 
The ISPE GAMP\u00ae Guide is referenced as providing a risk-based approach to compliant GxP computerized systems.", "excerpt_keywords": "ISPE, GAMP, Records, Data Integrity, Guide"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: page 3\n\n### acknowledgements\n\nthe ispe gamp(r) records and data integrity guide was produced by a task team led by:\n\n|nigel price|qcdi ltd.|united kingdom|\n|---|---|---|\n|mike rutherford|eli lilly and company|usa|\n|sion wyn|conformity limited|united kingdom|\n\nthe work was supported by the ispe gamp community of practice (cop).\n\n### core team\n\nthe following individuals took lead roles in the preparation of this guide:\n\n|chris clark|tententen consulting|united kingdom|\n|---|---|---|\n|colin jones|conformity limited|united kingdom|\n|tony margetts|factorytalk co., ltd.|thailand|\n|mark newton|eli lilly and company|usa|\n|arthur \"randy\" perez|novartis (retired)|usa|\n|chris reid|integrity solutions ltd.|united kingdom|\n|lorrie vuolo-schuessler|glaxosmithkline|usa|\n|charlie wakeham|waters australia pty. ltd.|australia|\n|christopher white|alexion pharmaceuticals|usa|\n|guy wingate|glaxosmithkline|united kingdom|\n\n### regulatory input and review\n\nparticular thanks go to the following for their review and comments on this guide:\n\n|david churchward|mhra|united kingdom|\n|---|---|---|\n|stephen grayson|mhra|united kingdom|\n|karl-heinz menges|regierungsprasidium darmstadt|germany|\n|ian thrussell|world health organization|united kingdom|\n\n### subject matter expert input and review\n\nparticular thanks go to the following for their review and comments on this guide:\n\n|monica cahilly|green mountain quality assurance, llc|usa|\n|---|---|---|\n|robert mcdowall|r.d. 
mcdowall ltd.|united kingdom|\n\nthe team would like to extend special thanks to the global gamp data integrity special interest group (sig) for their efforts.\n\nthe team leads would like to express their grateful thanks to the many individuals and companies from around the world who reviewed and provided comments during the preparation of this guide; although they are too numerous to list here, their input is greatly appreciated.\n\ncompany affiliations are as of the final draft of the guide.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "ca5977d2-547f-4300-9767-f9abcf87b5c0": {"__data__": {"id_": "ca5977d2-547f-4300-9767-f9abcf87b5c0", "embedding": null, "metadata": {"page_label": "6", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "ISPE Tampa Office Contact Information and Employee Directory", "questions_this_excerpt_can_answer": "1. What is the contact telephone number for the ISPE office located at 600 N. Westshore Blvd., Suite 900, Tampa, Florida?\n \n2. As of the document's last modification in April 2024, what is the fax number for the ISPE Tampa Office?\n\n3. Can you provide the official website URL for the ISPE as listed in their Tampa Office Contact Information and Employee Directory document?", "prev_section_summary": "The section provides acknowledgements and contributions for the ISPE GAMP(R) Records and Data Integrity Guide. 
It includes information on the task team leaders, core team members, regulatory bodies, subject matter experts, and the global GAMP data integrity special interest group involved in the preparation of the guide. Key topics covered include the organizations and countries of the individuals involved, as well as the regulatory input and review provided by organizations such as MHRA, World Health Organization, and Regierungsprasidium Darmstadt. The section also highlights the contributions of subject matter experts from various countries and expresses gratitude to all individuals and companies who reviewed and provided comments during the guide's preparation.", "excerpt_keywords": "ISPE, Records, Data Integrity, Guide, Tampa Office"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n600 n. westshore blvd., suite 900, tampa, florida 33609 usa\n\ntel: +1-813-960-2105, fax: +1-813-264-2816\n\nwww.ispe.org", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "8b322373-677d-4e65-b3de-5dccfd4e19dd": {"__data__": {"id_": "8b322373-677d-4e65-b3de-5dccfd4e19dd", "embedding": null, "metadata": {"page_label": "7", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Data Integrity and Quality Risk Management Guide", "questions_this_excerpt_can_answer": "1. 
What are the key elements of a data governance framework as outlined in the ISPE Records and Data Integrity Guide, and how do they contribute to ensuring data integrity within pharmaceutical development and manufacturing environments?\n\n2. How does the ISPE Records and Data Integrity Guide address the role of human factors in data integrity, and what strategies does it propose to mitigate risks associated with these factors?\n\n3. Can you detail the stages of the data life cycle as described in the ISPE Records and Data Integrity Guide, and explain how each stage is critical to maintaining the integrity of data in the context of quality risk management in the pharmaceutical industry?", "prev_section_summary": "The key topics of this section are the contact information for the ISPE office located in Tampa, Florida. The section includes the office address, telephone number, fax number, and official website URL for ISPE. The entities mentioned are the ISPE organization and its Tampa office.", "excerpt_keywords": "ISPE, Records, Data Integrity, Pharmaceutical, Quality Risk Management"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n|content|page number|\n|---|---|\n|introduction|9|\n|background|9|\n|purpose|9|\n|scope|10|\n|structure of this guide|10|\n|key concepts|11|\n|key terms|15|\n|regulatory focus|17|\n|introduction|17|\n|data integrity requirements|17|\n|data governance framework|21|\n|introduction|21|\n|overview|21|\n|elements of the data governance framework|23|\n|human factors in data integrity|30|\n|data integrity maturity model|31|\n|data life cycle|33|\n|introduction|33|\n|data creation|34|\n|data processing|35|\n|data review reporting and use|36|\n|data retention and retrieval|39|\n|data destruction|42|\n|quality risk management|43|\n|introduction|43|\n|process risk assessment|43|\n|quality risk management approach|43|\n|product and process 
context|45|\n|appendix m1 - corporate data integrity program|47|\n|introduction|47|\n|is a corporate data integrity program required?|47|\n|indicators of program scope and effort|48|\n|implementation considerations|50|\n|keys to success|52|\n|appendix m2 - data integrity maturity model|55|\n|maturity model|55|\n|data integrity maturity level characterization|59|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "dc77f025-ce42-4dd3-90ed-697d15a7e211": {"__data__": {"id_": "dc77f025-ce42-4dd3-90ed-697d15a7e211", "embedding": null, "metadata": {"page_label": "8", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Comprehensive Guide to Data Integrity and Records Management in the Pharmaceutical Industry", "questions_this_excerpt_can_answer": "1. How does the ISPE GAMP\u00ae Guide address the role of human factors, such as corporate and local cultures, in records and data integrity within the pharmaceutical industry?\n\n2. What specific guidance does the ISPE GAMP\u00ae Guide offer on the application and use of audit trails for ensuring data integrity, including technical aspects and system design considerations?\n\n3. 
In the context of preparing for regulatory inspections within the pharmaceutical industry, what key information does the ISPE GAMP\u00ae Guide suggest should be readily available, and what general procedures are recommended for inspection readiness?", "prev_section_summary": "The section from the ISPE Records and Data Integrity Guide covers key topics such as data governance framework, human factors in data integrity, data life cycle, quality risk management, and corporate data integrity program. It outlines the elements of a data governance framework, strategies to mitigate risks associated with human factors, stages of the data life cycle, and approaches to quality risk management. The guide also includes a data integrity maturity model and provides insights on implementing a corporate data integrity program. Key entities discussed include data creation, processing, review, reporting, retention, destruction, process risk assessment, and product and process context within the pharmaceutical industry.", "excerpt_keywords": "ISPE, GAMP Guide, data integrity, records management, pharmaceutical industry"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: records and data integrity\n\n|appendix m3 - human factors|67|\n|---|---|\n|8.1 introduction|67|\n|8.2 corporate and local cultures|67|\n|8.3 classification of incidents|68|\n|8.4 human error|69|\n|8.5 data falsification and fraud|70|\n|8.6 impartiality|71|\n|8.7 behavioral controls|71|\n\n|appendix m4 - data audit trail and audit trail review|75|\n|---|---|\n|9.1 introduction|75|\n|9.2 regulatory background|76|\n|9.3 application and use of audit trails|77|\n|9.4 audit trail review|79|\n|9.5 technical aspects and system design|79|\n\n|appendix m5 - data auditing and periodic review|81|\n|---|---|\n|10.1 introduction|81|\n|10.2 auditing for data integrity|81|\n|10.3 periodic review|82|\n|10.4 other 
reviews|83|\n|10.5 documenting review processes|83|\n\n|appendix m6 - inspection readiness|85|\n|---|---|\n|11.1 general procedures|85|\n|11.2 key information for regulatory inspections|86|\n\n|appendix m7 - integrating data integrity into existing records management processes|91|\n|---|---|\n|12.1 introduction|91|\n|12.2 record creation|92|\n|12.3 active records|92|\n|12.4 semi-active records|92|\n|12.5 inactive records|92|\n\n## development appendices\n\n|appendix d1 - user requirements|93|\n|---|---|\n|13.1 introduction|93|\n|13.2 business process|93|\n|13.3 general data integrity requirements|94|\n\n|appendix d2 - process mapping and interfaces|99|\n|---|---|\n|14.1 introduction|99|\n|14.2 process flowcharts|99|\n|14.3 data flow diagrams|102|\n|14.4 how much is needed?|103|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "35a2dc57-6271-455d-872d-2aa8263a6bf0": {"__data__": {"id_": "35a2dc57-6271-455d-872d-2aa8263a6bf0", "embedding": null, "metadata": {"page_label": "9", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Ensuring Records Management and Data Integrity: Controls, Concerns, and Operations\"", "questions_this_excerpt_can_answer": "1. What specific risk control measures are recommended for ensuring the integrity of records, data, and electronic signatures in a pharmaceutical environment, as outlined in the ISPE Records and Data Integrity Guide?\n\n2. 
How does the ISPE Records and Data Integrity Guide address data integrity concerns related to different system architectures, including local hard disks, internally managed central databases, and outsourced managed services?\n\n3. What guidelines does the ISPE Records and Data Integrity Guide provide for the retention, archiving, and migration of records in pharmaceutical operations, including considerations for audit trails and converting electronic records to alternative formats or media?", "prev_section_summary": "The section focuses on the ISPE GAMP\u00ae Guide's comprehensive approach to records and data integrity in the pharmaceutical industry. Key topics include human factors, such as corporate and local cultures, incidents classification, human error, data falsification, fraud, impartiality, and behavioral controls. The guide also provides specific guidance on audit trails, including their application, review, technical aspects, and system design considerations. Additionally, it covers auditing for data integrity, periodic review, inspection readiness procedures, and integrating data integrity into existing records management processes. 
The section also includes development appendices on user requirements, business processes, process mapping, interfaces, and general data integrity requirements.", "excerpt_keywords": "ISPE, Records Management, Data Integrity, Pharmaceutical Industry, Audit Trails"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## records and data integrity\n\n|appendix d3 - risk control measures for records, data, and electronic signatures|105|\n|---|---|\n|15.1 introduction|105|\n|15.2 record and data controls|105|\n|15.3 electronic signature controls|105|\n|15.4 implementation of record and data controls|107|\n|15.5 rigor of controls|110|\n\n|appendix d4 - data integrity concerns related to system architecture|111|\n|---|---|\n|16.1 data resides on a local hard disk|111|\n|16.2 internally managed central database|112|\n|16.3 internally managed distributed data|112|\n|16.4 outsourced managed services|113|\n\n|appendix d5 - data integrity for end-user applications|117|\n|---|---|\n|17.1 introduction|117|\n|17.2 data integrity for spreadsheets|117|\n|17.3 data integrity for pc databases|119|\n|17.4 data integrity for statistical tools|120|\n\n## operation appendices\n\n|appendix o1 - retention, archiving, and migration|121|\n|---|---|\n|18.1 introduction|121|\n|18.2 retention options|121|\n|18.3 protection of records|121|\n|18.4 record aging and risk|122|\n|18.5 archival|122|\n|18.6 hybrid situations and archives|123|\n|18.7 audit trail considerations|124|\n|18.8 alternative systems|125|\n|18.9 converting electronic to alternative format or alternative media hybrids|126|\n\n|appendix o2 - paper records and hybrid situations|131|\n|---|---|\n|19.1 paper records|131|\n|19.2 hybrid situations|133|\n|19.3 use of forms to enforce procedures|135|\n\n## general appendices\n\nappendix g1 - references\n137\n\n|appendix g2 - glossary|141|\n|---|---|\n|21.1 acronyms and 
abbreviations|141|\n|21.2 definitions|143|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "d7a165fa-89cd-4fe1-9e7d-fe0bc87a7c95": {"__data__": {"id_": "d7a165fa-89cd-4fe1-9e7d-fe0bc87a7c95", "embedding": null, "metadata": {"page_label": "10", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Exploring the Void: A Collection of Absences\"", "questions_this_excerpt_can_answer": "Based on the provided context, here are three questions that the context can provide specific answers to, which are unlikely to be found elsewhere:\n\n1. **What is the file size of the \"ISPE Records and Data Integrity Guide\" as stored in the PharmaWise Engineer project on Google Drive?**\n - This question is specific to the document's storage details within a particular project environment, which would not be commonly available information. The context directly provides the file size as 6245344 bytes, which is a detail specific to this document's version or copy stored in the mentioned location.\n\n2. **What are the creation and last modification dates of the \"ISPE Records and Data Integrity Guide\" PDF stored in the PharmaWise CSV & Data Integrity project directory?**\n - The context uniquely identifies the creation and last modification dates of the document, which are 2024-04-07 and 2024-04-04, respectively. These dates could be relevant for version control and document management purposes within the project, and such specific information is unlikely to be found in other sources.\n\n3. 
**Under what title is the \"ISPE Records and Data Integrity Guide\" listed in the PharmaWise Engineer project's raw data directory, and how does it relate to the document's content?**\n - The document is listed under a seemingly unrelated title, \"Exploring the Void: A Collection of Absences,\" which raises questions about the relationship between the title and the document's content. This question probes into the rationale behind the document's titling and classification within the project, which is a context-specific detail not likely to be answered by external sources.\n\nThese questions are tailored to extract information that is uniquely available from the provided context, focusing on document management, project-specific details, and the peculiarities of document titling and classification within a specialized project environment.", "prev_section_summary": "The section focuses on ensuring records management and data integrity in a pharmaceutical environment, as outlined in the ISPE Records and Data Integrity Guide. It covers specific risk control measures for records, data, and electronic signatures, data integrity concerns related to different system architectures, guidelines for retention, archiving, and migration of records, and considerations for audit trails and converting electronic records to alternative formats or media. 
Key topics include record and data controls, electronic signature controls, data integrity for end-user applications, retention options, protection of records, archival processes, audit trail considerations, and converting electronic records to alternative formats or media.", "excerpt_keywords": "ISPE, Records, Data Integrity, Pharmaceutical, Risk Control"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\nno_content_here", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "f91c11b5-0c3f-4610-89b0-6aadb5b1ad03": {"__data__": {"id_": "f91c11b5-0c3f-4610-89b0-6aadb5b1ad03", "embedding": null, "metadata": {"page_label": "11", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "ISPE GAMP(R) Guide: Records and Data Integrity - Principles and Practical Guidance", "questions_this_excerpt_can_answer": "1. How does the ISPE GAMP(R) Guide: Records and Data Integrity define \"regulated data\" and \"metadata,\" and what is their significance in maintaining data integrity within regulated companies?\n \n2. What are the specific regulatory frameworks and guidelines that the ISPE GAMP(R) Guide: Records and Data Integrity aligns with, particularly in the context of managing GxP regulated records and data to ensure their integrity throughout their lifecycle?\n\n3. 
What are the potential consequences for a regulated company if it fails to maintain the integrity of its records and data as outlined in the ISPE GAMP(R) Guide: Records and Data Integrity, and how does this guide propose to mitigate such risks?", "prev_section_summary": "The section provides details about a specific document titled \"ISPE Records and Data Integrity Guide\" stored in the PharmaWise Engineer project on Google Drive. It includes information such as the file size (6245344 bytes), creation date (2024-04-07), last modification date (2024-04-04), and the document title \"Exploring the Void: A Collection of Absences.\" The section also presents three specific questions that can be answered based on the context provided, focusing on the document's storage details, creation and modification dates, and the relationship between the document title and its content within the project environment.", "excerpt_keywords": "ISPE, GAMP, Records, Data Integrity, Regulatory Frameworks"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n# ispe gamp(r) guide: page 9\n\n## records and data integrity\n\n### 1 introduction\n\n#### 1.1 background\n\nthe impact of record and data integrity issues can be significant on a regulated company. it can result in recalls of products, warning or untitled letters, import alerts, injunctions, seizures, application integrity policy invocations/legal action, and ultimately the potential for patient harm. 
these regulatory actions can also have a significant financial impact.\n\nthere has been increased regulatory focus on all aspects of data integrity, including publication of specific regulatory guidance on the topic, and an increased number of citations in the area.\n\nfor the purposes of this guide:\n\n- regulated data is information used for a regulated purpose or to support a regulated process.\n- \"metadata is data that describe the attributes of other data, and provide context and meaning. typically, these are data that describe the structure, data elements, inter-relationships, and other characteristics of data.\" [1].\n- a regulated record is a collection of regulated data (and any metadata necessary to provide meaning and context) with a specific gxp purpose, content, and meaning, and required by gxp regulations. records include instructions as well as data and reports.\n- \"data integrity is defined as the extent to which all data are complete, consistent and accurate throughout the data life cycle.\" [1]\n- the integrity of records depends on the integrity of underlying data, and signatures executed to electronic records should be trustworthy and reliable. see appendix d3.\n\nthis guide addresses paper records, electronic records, and hybrid situations, while encouraging a move away from hybrid situations, wherever practical.\n\n#### 1.2 purpose\n\nthis ispe gamp(r) guide: records and data integrity provides principles and practical guidance on meeting current expectations for the management of gxp regulated records and data, ensuring that they are complete, consistent, secure, accurate, and available throughout their life cycle. this approach is intended to encourage innovation and technological advance while avoiding unacceptable risk to product quality, patient safety, and public health.\n\nthis guide is intended as a stand-alone guide. it is aligned with ispe gamp(r) 5: a risk-based approach to compliant gxp computerized systems [3]. 
this guide has been designed so that it can be used in parallel with guidance provided both in ispe gamp(r) 5 [3] and other ispe gamp(r) good practice guides [4].\n\nalthough the scope of this document is wider, it replaces the ispe gamp(r) good practice guide: a risk-based approach to compliant electronic records and signatures.\n\nwithin the us regulatory framework regulated electronic records, and associated signatures, are subject to 21 cfr part 11 [2]. for further information see appendix d3.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "3815faec-ea86-4aa0-95f3-681be3b929af": {"__data__": {"id_": "3815faec-ea86-4aa0-95f3-681be3b929af", "embedding": null, "metadata": {"page_label": "12", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "ISPE GAMP Guide: Managing Risk to Record and Data Integrity in Regulated Life Science Industries", "questions_this_excerpt_can_answer": "1. What specific regulations and guidance documents does the ISPE GAMP Guide consider when addressing the integrity of GxP records and data within the regulated life science industries?\n \n2. How does the ISPE GAMP Guide propose to integrate existing risk management activities and tools within its framework for managing risk to record and data integrity in regulated life science industries?\n\n3. 
What unique approach does the ISPE GAMP Guide suggest for applying the quality risk management (QRM) approach from ISPE GAMP(R) 5 to ensure record and data integrity, and how does it integrate with a complete data life cycle approach as part of a Quality Management System (QMS)?", "prev_section_summary": "The section discusses the importance of records and data integrity in regulated companies, highlighting the potential consequences of integrity issues such as product recalls and regulatory actions. It defines regulated data, metadata, regulated records, and data integrity, emphasizing the need for complete, consistent, and accurate data throughout its lifecycle. The guide provides principles and practical guidance for managing GxP regulated records and data, aligning with regulatory frameworks such as 21 CFR Part 11. It encourages innovation while ensuring product quality, patient safety, and public health. The section also addresses paper records, electronic records, and hybrid situations, advocating for a move towards fully electronic records where practical.", "excerpt_keywords": "ISPE, GAMP Guide, Data Integrity, Regulated Life Science Industries, Quality Risk Management"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide:\n\n### 1.3 scope\n\nthis guide addresses the integrity of gxp records and data used within the regulated life science industries including pharmaceutical, biological, and medical devices. 
the guidance is intended for regulated companies and suppliers of systems, products, or services in this area, as well as a useful reference for regulators.\n\napplicable life science regulations and guidance have been taken into account, and the following publications have been specifically considered:\n\n- us code of federal regulations (cfrs) [5] covering gcp, glp, gmp, and medical devices\n- us 21 cfr part 11 [2] and associated guidance\n- relevant sections of eu gmps including chapter 4 [6] and annex 11 [7]\n- mhra gmp data integrity definitions and guidance for industry (revision 1.1, march 2015) [1]\n- mhra gxp data integrity definitions and guidance for industry (draft version for consultation july 2016) [8]\n- fda draft guidance for industry: data integrity and compliance with cgmp [9]\n- ich q9 quality risk management [10]\n- ich q10 pharmaceutical quality system [11]\n- who annex 5: guidance on good data and record management practices [12]\n- pic/s draft guidance: good practices for data management and integrity in regulated gmp/gdp environments (draft 2 published august 2016) [13]\n\nthis guide provides a method for managing risk to record and data integrity. regulated companies may already have established risk management activities and tools, and this guide does not intend or imply that these existing methods should be discarded, rather that they continue to be used as appropriate within the context of the overall risk management process described. other methods or techniques giving documented evidence of adequate control, and ensuring appropriate security and integrity, may also be acceptable.\n\nthis guide may also be useful in other regulated areas such as cosmetics and food, or in other areas or sectors where data integrity is important.\n\n### 1.4 structure of this guide\n\nthis guide contains this introduction, a main body, and a set of appendices. 
it has been structured to meet the needs of various readers, and contains, in increasing level of detail:\n\n1. data integrity requirements, critical areas of regulatory focus and concern, and key concepts\n2. a framework for data governance and the importance of human factors\n3. a complete data life cycle approach as part of a quality management system (qms), from creation to destruction\n4. further information on how to apply the quality risk management (qrm) approach from ispe gamp(r) 5 [3] to record and data integrity", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "e38dc33b-2925-45fc-a9f7-14c35ea939f7": {"__data__": {"id_": "e38dc33b-2925-45fc-a9f7-14c35ea939f7", "embedding": null, "metadata": {"page_label": "13", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Ensuring Records and Data Integrity: Key Concepts and Risk Management Approach", "questions_this_excerpt_can_answer": "1. How does the ISPE Records and Data Integrity Guide recommend managing risks to ensure the integrity of records and data within regulated processes, and what specific methodologies does it reference for this purpose?\n \n2. What are the two main types of documents defined by EU GMP Chapter 4 as per the ISPE Records and Data Integrity Guide, and what examples are provided for each type?\n\n3. 
According to the ISPE Records and Data Integrity Guide, how is a primary record defined and determined in cases where data collected and retained concurrently by more than one method does not correspond, and which regulatory body provides this definition?", "prev_section_summary": "The section discusses the scope of the ISPE GAMP Guide in addressing the integrity of GxP records and data within regulated life science industries. It considers various regulations and guidance documents, such as US CFRs, EU GMPs, MHRA guidance, FDA draft guidance, ICH guidelines, WHO annex, and PIC/S draft guidance. The guide provides a method for managing risk to record and data integrity, integrating existing risk management activities and tools. It also emphasizes the importance of data governance, human factors, and a complete data life cycle approach within a Quality Management System (QMS). The section is structured to provide detailed information on data integrity requirements, regulatory focus areas, key concepts, data governance framework, QRM approach, and the complete data life cycle.", "excerpt_keywords": "ISPE, Records, Data Integrity, Risk Management, Data Governance"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## records and data integrity\n\n|content|page number|\n|---|---|\n|more detailed information, including \"how to\" guidance for specific topics, in a series of management, development, and operation appendices|11|\n\n### key concepts\n\nthis section describes key concepts that apply throughout this guide.\n\n#### risk management approach\n\na holistic and flexible risk management approach should be used to ensure the integrity of records and data. this is achieved by the application of appropriate controls to manage identified risks within the context of the regulated process. 
the effort required to assess and manage risk should be commensurate with the level of risk. critical thinking and analysis skills should be applied to identify and adequately control risks to patient safety, product quality, and data integrity. the qrm approach defined in ispe gamp(r) 5 [3], following ich q9 [10], (and also detailed in section 5) can be applied to identifying, assessing, and managing risks to data and record integrity. a full understanding of the regulated process to be supported, including the intended use of data within the process, is fundamental. data integrity cannot be achieved without a complete understanding of the data flow.\n\n#### data governance\n\ndata governance is the sum total of arrangements to ensure that data is recorded, processed, retained and used to ensure a complete, consistent, and accurate record throughout the data life cycle [1]. data governance ensures formal management of records and data throughout the regulated company. data governance encompasses the people, processes, and technology required for effective data handling. see section 3.\n\n#### data life cycle\n\nall phases in the data life cycle from initial data creation, capture, and recording through processing (including transformation or migration), review, reporting, retention, retrieval, and destruction should be controlled and managed in order to ensure accurate, reliable, and compliant records and data. see section 4. details of the life cycle will vary depending on the type of documentation. see section 4. two main types of documents are defined by eu gmp chapter 4 [6]:\n\n1. instructions (directions or requirements) type, e.g., specifications, manufacturing formulae, processing, packaging, and testing instructions, sops, protocols, and technical agreements\n2. 
record/report type, e.g., batch records, laboratory testing results, certificates of analysis, reports\n\nregulated data should be controlled and managed, and integrity of the data ensured, e.g., following the principles and requirements described in this guide. all regulated data is subject to gxp requirements for data integrity and good documentation practices. a primary record is the record which takes priority in cases where data is collected and retained concurrently by more than one method, and the data does not correspond. the primary record attribute should be defined and documented, and should not be changed on a case by case basis. the uk mhra [1] defines a primary record as: \"the record which takes primacy in cases where data that are collected and retained concurrently by more than one method fail to concur.\"", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "cb6d1026-38e2-4052-9257-e4e65ce191d9": {"__data__": {"id_": "cb6d1026-38e2-4052-9257-e4e65ce191d9", "embedding": null, "metadata": {"page_label": "14", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Key Concepts for Record and Data Integrity: Alcoa and Alcoa+ Principles Explained", "questions_this_excerpt_can_answer": "1. How does the ISPE GAMP\u00ae Guide suggest using risk management principles to enhance the accuracy and integrity of primary records, especially in the context of choosing between high resolution/dynamic data and low resolution/static data?\n\n2. 
According to the WHO Annex 5 and PIC/S guidance, what are the foundational principles of good GxP data management, and how do they propose managing risks to data integrity throughout the data life cycle?\n\n3. Can you detail how the ALCOA and ALCOA+ principles are applied throughout the data life cycle, as outlined in Tables 1.1 and 1.2 of the \"Key Concepts for Record and Data Integrity: ALCOA and ALCOA+ Principles Explained\" document, including specific expectations for each principle?", "prev_section_summary": "The section discusses key concepts related to records and data integrity, including the importance of a risk management approach, data governance, and the data life cycle. It emphasizes the need for controls to manage risks and ensure patient safety, product quality, and data integrity. The section also defines two main types of documents according to EU GMP Chapter 4 and explains the concept of a primary record in cases where data does not correspond. Regulatory bodies such as ISPE, ICH, and UK MHRA are referenced for guidance on managing risks and ensuring data integrity.", "excerpt_keywords": "ISPE, GAMP, data integrity, ALCOA, ALCOA+"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: records and data integrity\n\nrisk management principles should be used to ensure that the primary record provides the greatest accuracy, completeness, content, and meaning. for example, high resolution or dynamic (electronic) data should be designated as a primary record in preference to low resolution or static (printed/manual) data. 
all relevant data should be considered when performing activities such as a risk-based investigation into data anomalies (e.g., out of specification results).\n\n### key concepts summarized by alcoa and alcoa+\n\nboth the who guidance and the draft pic/s good practices for data management and integrity in regulated gmp/gdp environments indicate that key concepts described by the alcoa (and alcoa+) acronyms can help to support record and data integrity.\n\nthe who annex 5: guidance on good data and record management practices states:\n\n- the basic building blocks of good gxp data are to follow gdocp and then to manage risks to the accuracy, completeness, consistency and reliability of the data throughout their entire period of usefulness - that is, throughout the data life cycle.\n- personnel should follow gdocp for both paper records and electronic records in order to assure data integrity.\n\npic/s further states:\n\n- the application of gdocps may vary depending on the medium used to record the data (i.e. physical vs. 
electronic records), but the principles are applicable to both...\n- some key concepts of gdocps are summarised by the acronym alcoa...\n\nalong with the additional key concepts described by alcoa+, pic/s goes on to state:\n\n- together, these expectations ensure that events are properly documented and the data can be used to support informed decisions.\n\ntables 1.1 and 1.2 provide information on how key concepts described by alcoa and alcoa+ should be applied throughout the data life cycle.\n\n|table 1.1: alcoa|\n|---|\n|principle|data expectation|\n|attributable|- attributable to the person or system generating the data\n- identify the person or system performing an activity that creates or modifies data\n- linked to the source of the data\n|\n|legible|- readable and permanent\n- accessible throughout the data life cycle\n- original data and any subsequent modifications are not obscured\n|\n|contemporaneous|recorded or observed at the time the activity is performed|\n|original|original data is the first recording of data, or a \"true copy\" which preserves content or meaning|\n|accurate|- free from error\n- no editing performed without documented amendments\n- conforming to truth or standard\n|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "e82b6d27-f513-4cb0-bfcd-ac1653a313fe": {"__data__": {"id_": "e82b6d27-f513-4cb0-bfcd-ac1653a313fe", "embedding": null, "metadata": {"page_label": "15", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Critical Thinking Strategies for Ensuring 
Data Integrity\"", "questions_this_excerpt_can_answer": "1. How does the ISPE GAMP(r) Guide define the expectations for data integrity in terms of completeness, consistency, endurance, and availability, and how do these principles contribute to ensuring data integrity in pharmaceutical manufacturing processes?\n\n2. In what ways does the ISPE Records and Data Integrity Guide suggest critical thinking skills can be applied to enhance data governance and integrity, and what specific elements of critical thinking are highlighted as beneficial for identifying and assessing risks to product quality and patient safety in the pharmaceutical industry?\n\n3. According to the ISPE Records and Data Integrity Guide, how can critical thinking facilitate the creation and application of new models and processes to meet evolving technical and regulatory needs in the pharmaceutical sector, and why is a product quality driven and patient-focused approach preferred over a purely document-driven and compliance-focused approach?", "prev_section_summary": "The section discusses the use of risk management principles to enhance the accuracy and integrity of primary records, emphasizing the preference for high resolution/dynamic data over low resolution/static data. It also explains the foundational principles of good GxP data management according to WHO Annex 5 and PIC/S guidance, highlighting the application of ALCOA and ALCOA+ principles throughout the data life cycle. The ALCOA principles (attributable, legible, contemporaneous, original, accurate) are detailed, along with expectations for each principle in ensuring proper documentation and support for informed decisions. 
Tables 1.1 and 1.2 provide information on how these key concepts should be applied throughout the data life cycle.", "excerpt_keywords": "ISPE, GAMP, data integrity, critical thinking, pharmaceutical manufacturing"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: page 13\n\n|principle|data expectation|\n|---|---|\n|complete|* all data, and relevant metadata, including any repeat or re-analysis performed|\n|consistent|* application of good documentation practices throughout any process * the application of date and time stamps in the expected sequence|\n|enduring|* recorded in a permanent, maintainable form for the retention period|\n|available|* available and accessible for review, audit, or inspection throughout the retention period|\n\n1.5.5 critical thinking\n\ncritical thinking is a systematic, rational, and disciplined process of evaluating information from a variety of perspectives to yield a balanced and well-reasoned answer. critical thinking allows the effective interpretation of data and situations while avoiding personal biases, assumptions, and other factors [14].\n\nthe application of critical thinking skills allows the identification of gaps in data governance and processes, and assists in challenging the effectiveness of behavioral, procedural, and technical controls in achieving data integrity. critical thinking is an important component of data integrity, and many regulators are trained in critical thinking to help them more quickly to identify and assess risk to product quality and patient safety. 
pic/s guidance [13] states:\n\n\"critical thinking skills should be used by inspectors to determine whether control and review procedures effectively achieve their desired outcomes.\"\n\nelements of critical thinking include:\n\n- analyzing situations through gathering relevant details, and reviewing them carefully and objectively through applying knowledge and experience\n- gathering and evaluating information from different sources, understanding links between concepts and ideas, and identifying inconsistencies and errors in reasoning\n- analyzing situations and solving problems consistently, systematically, and logically\n- evaluating information in an open-minded manner to better interpret and understand all available data and signals\n- challenging and questioning ideas and assumptions in a rational and balanced manner\n- comparing, contrasting, and testing alternatives based on ambiguous, incomplete, or partial information\n- creating, developing, and applying new models from experience\n- designing new processes to meet changing process, technical, and regulatory needs\n\ncritical thinking encourages a product quality driven and patient focused approach, rather than a document driven and purely compliance focused approach.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "ad6d2f7b-d327-4a95-b069-8f175bc20f6f": {"__data__": {"id_": "ad6d2f7b-d327-4a95-b069-8f175bc20f6f", "embedding": null, "metadata": {"page_label": "16", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Ensuring Data 
Integrity and System Life Cycle Compliance in Regulated Companies: A Comprehensive Guide\"", "questions_this_excerpt_can_answer": "1. How does the ISPE GAMP(r) Guide propose regulated companies ensure data integrity through their computerized systems?\n \n2. What specific approach does the ISPE GAMP(r) Guide recommend for managing the life cycle of GxP computerized systems to maintain data integrity, and how does it suggest scaling these activities?\n\n3. In the context of ensuring data integrity in regulated companies, how does the ISPE GAMP(r) Guide differentiate between standalone GxP computerized systems and enterprise GxP computerized systems in terms of risk, impact, and management strategies?", "prev_section_summary": "The section discusses the ISPE GAMP(r) Guide's expectations for data integrity in terms of completeness, consistency, endurance, and availability. It also highlights the importance of critical thinking skills in enhancing data governance and integrity in pharmaceutical manufacturing processes. The section outlines elements of critical thinking such as analyzing situations objectively, gathering information from different sources, and creating new models and processes to meet evolving technical and regulatory needs. Critical thinking is emphasized as a key factor in identifying and assessing risks to product quality and patient safety, and promoting a product quality driven and patient-focused approach over a purely compliance-focused one.", "excerpt_keywords": "Keywords: ISPE, GAMP, data integrity, computerized systems, life cycle"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide:\n\nthis guide encourages regulated companies to apply critical thinking and promote its application through leadership, records and data integrity awareness, and training. 
this guide encourages the application of critical thinking as part of a holistic top-down risk-based approach. the identification of appropriate and effective controls within a specific process context, in accordance with an understanding of risks to patient and product, is encouraged.\n\n### 1.5.6 gxp computerized system life cycle\n\ndata integrity is underpinned by well-documented, validated gxp computerized systems, and the application of appropriate controls throughout both the system and data life cycles. multiple gxp computerized systems may be involved in supporting a data life cycle, as the data may be passed from system to system. to ensure data integrity all gxp computerized systems should be trustworthy and validated for intended use. a system life cycle approach, such as described in ispe gamp(r) 5 [3], should be applied to each gxp computerized system. record and data integrity should be built-in and maintained throughout the gxp computerized system life cycle phases, from concept through project and operations, to retirement. the gxp computerized system life cycle activities should be scaled based on the complexity and novelty of the system, and potential impact on product quality, patient safety, and data integrity. the inherent risk of standalone gxp computerized systems may be greater than for enterprise gxp computerized systems. standalone gxp computerized systems may require different approaches. 
there should be sufficient effort placed on identifying and managing standalone systems, based on the level of impact and vulnerability, when designing controls, defining the data life cycle, and applying the data governance framework.\n\n### 1.5.7 summary of the key concepts\n\n|data and system life cycle|governance and management|procedural and technical controls|human factors|\n|---|---|---|---|\n|critical thinking; risk management; and alcoa+| | | |", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "03f174e8-d52f-4363-9e75-e4e90aa599fd": {"__data__": {"id_": "03f174e8-d52f-4363-9e75-e4e90aa599fd", "embedding": null, "metadata": {"page_label": "17", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Data Management and Integrity in Regulated Environments: Best Practices and Guidelines", "questions_this_excerpt_can_answer": "1. What is the role of a data governance framework in managing data integrity within the ISPE GAMP(r) guide's recommendations?\n \n2. How does the ISPE GAMP(r) guide define \"metadata\" and its importance in the context of regulated data management and integrity?\n\n3. 
What examples does the ISPE GAMP(r) guide provide to illustrate atypical, aberrant, or anomalous results in the context of data integrity and management?", "prev_section_summary": "The section discusses the importance of data integrity in regulated companies and how the ISPE GAMP(r) Guide recommends ensuring data integrity through critical thinking, risk-based approaches, and appropriate controls in GxP computerized systems. It emphasizes the need for well-documented, validated systems with built-in integrity throughout the system life cycle phases. The guide suggests scaling activities based on system complexity and potential impact on product quality, patient safety, and data integrity. It also differentiates between standalone and enterprise GxP computerized systems in terms of risk, impact, and management strategies, highlighting the need for tailored approaches for standalone systems. The key concepts include critical thinking, risk management, ALCOA+ principles, governance, procedural and technical controls, and human factors.", "excerpt_keywords": "ISPE, GAMP, data integrity, metadata, regulated data"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: page 15\n\nrecords and data integrity\n\ndata is managed through a controlled data management life cycle within a data governance framework. a holistic risk management approach is applied to manage risks to data integrity and to ensure that the principles of alcoa+ are met. the data life cycle may be supported by one or more computerized systems that should be trustworthy and compliant.\n\n### key terms\n\n|regulated data|information used for a regulated purpose or to support a regulated process.|\n|---|---|\n|metadata|\"metadata is data that describes the attributes of other data, and provide context and meaning. 
typically, these are data that describe the structure, data elements, inter-relationships and other characteristics of data.\" [1] \"it also permits data to be attributable to an individual (or if automatically generated, to the original data source).\" [8]|\n|regulated record|a collection of regulated data (and any metadata necessary to provide meaning and context) with a specific gxp purpose, content, and meaning, and required by gxp regulations. records include instructions as well as data and reports.|\n|atypical / aberrant / anomalous result|\"results that are still within specification but are unexpected, questionable, irregular, deviant or abnormal. examples would be chromatograms that show unexpected peaks, unexpected results for stability test point, etc.\"|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "fe66d819-f10e-4e0f-a84c-876bcb2be3e2": {"__data__": {"id_": "fe66d819-f10e-4e0f-a84c-876bcb2be3e2", "embedding": null, "metadata": {"page_label": "18", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Empty Canvas: A Collection of Absence\"", "questions_this_excerpt_can_answer": "Based on the provided context, here are three questions that the context can provide specific answers to, which are unlikely to be found elsewhere:\n\n1. 
**What is the file size of the \"ISPE Records and Data Integrity Guide\" as stored in the PharmaWise Engineer's PharmaWise CSV & Data Integrity directory on Google Drive?**\n - This question is specific to the metadata provided in the context, particularly regarding the file size of the document in question. The context directly states the file size as 6245344 bytes, which is a detail unlikely to be found in other sources.\n\n2. **What is the creation and last modification dates of the document titled \"Empty Canvas: A Collection of Absence\" found within the ISPE Records and Data Integrity Guide?**\n - The context provides unique information about the creation date (2024-04-07) and the last modified date (2024-04-04) of the document. This information is specific to the document's version stored in the mentioned directory and would be difficult to find elsewhere, especially since it pertains to internal document management details.\n\n3. **What is the discrepancy between the document title and its content as observed in the ISPE Records and Data Integrity Guide's specific section labeled \"Empty Canvas: A Collection of Absence\"?**\n - Given the document title \"Empty Canvas: A Collection of Absence\" and the excerpt indicating \"no_content_here,\" this question probes into the nature of the document's content or the intention behind its title and the apparent lack of content in the provided excerpt. 
This question seeks to understand the rationale or the thematic significance behind such a discrepancy, which is a detail that would be uniquely answered by the context or the document itself.\n\nThese questions are tailored to extract information that is uniquely available from the provided context, focusing on the document's metadata, content peculiarities, and specific document management details that are unlikely to be documented or discussed in other sources.", "prev_section_summary": "The section discusses the management of data integrity within a data governance framework as outlined in the ISPE GAMP(r) guide. It emphasizes the importance of a controlled data management life cycle and a holistic risk management approach to ensure compliance with the principles of ALCOA+. Key terms such as regulated data, metadata, regulated record, and atypical/aberrant/anomalous results are defined and explained in the context of regulated data management. The section also provides examples of atypical results to illustrate the concept of data integrity and management.", "excerpt_keywords": "Keywords: ISPE, Records, Data Integrity, Guide, Metadata"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\nno_content_here", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "14ec4584-b2bd-49af-af46-5afecab6b2fb": {"__data__": {"id_": "14ec4584-b2bd-49af-af46-5afecab6b2fb", "embedding": null, "metadata": {"page_label": "19", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": 
"2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Regulatory Focus on Data Integrity in GxP Records: Requirements for Compliance", "questions_this_excerpt_can_answer": "1. What specific areas of concern have regulatory bodies focused on regarding data integrity in GxP records, as outlined in the ISPE Records and Data Integrity Guide?\n \n2. How does the FDA suggest firms should approach the management of data integrity risks, according to the document \"Regulatory Focus on Data Integrity in GxP Records: Requirements for Compliance\"?\n\n3. What principles are recommended for ensuring the integrity and quality of data throughout its life cycle, as mentioned in the ISPE guide, and how does MHRA's stance on data governance integrate into the pharmaceutical quality system?", "prev_section_summary": "The section provides metadata details about a document titled \"Empty Canvas: A Collection of Absence\" stored in the ISPE Records and Data Integrity Guide. It includes information such as the file size, creation date, and last modification date of the document. The excerpt from the document itself indicates the absence of content in this specific section. 
The key topics revolve around document management, metadata, and the discrepancy between the document title and its content.", "excerpt_keywords": "ISPE, Records, Data Integrity, Regulatory Focus, GxP"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## records and data integrity\n\n2 regulatory focus\n\n2.1 introduction\n\nit is a regulatory expectation that gxp data and records are complete, consistent, reliable, accurate, that their content and meaning are preserved, and that they are available and usable for the required retention period.\n\nspecific areas impacting data integrity, which have been of particular regulatory focus and concern include:\n\n- lack of basic access control and security measures allowing unauthorized changes\n- shared user logins\n- missing or disabled audit trails\n- lack of contemporaneous recording of activities\n- failure to investigate data discrepancies\n- testing into compliance\n- incomplete collection, retention, and review of data for quality decisions\n- overwriting or deletion of original data\n- data falsification\n\nthe fda [9] states:\n\n\"cgmp regulations and guidance allow for flexible and risk-based strategies to prevent and detect data integrity issues. firms should implement meaningful and effective strategies to manage their data integrity risks based upon their process understanding and knowledge management of technologies and business models.\"\n\n2.2 data integrity requirements\n\nthis section provides an overview of the data integrity expectations on regulated companies based on published regulatory guidance documents. 
it is an overview and should not be considered as all inclusive.\n\nregulated companies should have confidence in the quality and the integrity of the data used to make decisions impacting product quality and patient safety.\n\ndata integrity controls for records should ensure that the accuracy, completeness, content, and meaning of data is maintained throughout the data life cycle. the principles of alcoa+ should be applied.\n\nmhra states:\n\n\"the effort and resource applied to assure the validity and integrity of the data should be commensurate with the risk and impact of a data integrity failure to the patient or environment.\" [8]\n\n\"the data governance system should be integral to the pharmaceutical quality system...\" [1]", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "1a7e330d-7fa3-4a04-baa0-736c1dbe2f14": {"__data__": {"id_": "1a7e330d-7fa3-4a04-baa0-736c1dbe2f14", "embedding": null, "metadata": {"page_label": "20", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Securing Data Integrity: A Comprehensive Approach through Holistic Strategies and Behavioral Steps\"", "questions_this_excerpt_can_answer": "1. What holistic strategies do regulated companies need to implement to manage risks to data integrity, and how should these strategies be based on their understanding of processes and technologies?\n \n2. 
How does the ISPE GAMP\u00ae Guide suggest senior management should contribute to maintaining data integrity within their organizations, and what specific steps should they take to ensure the allocation of necessary resources for addressing identified limitations?\n\n3. What are the three types of steps, as outlined in the ISPE GAMP\u00ae Guide, that regulated companies should take to achieve an acceptable level of data integrity, and what specific actions should be taken under the behavioral steps to foster a culture supporting data integrity?", "prev_section_summary": "The section focuses on regulatory expectations for data integrity in GxP records, highlighting specific areas of concern such as access control, audit trails, and data falsification. It discusses the FDA's flexible approach to managing data integrity risks and emphasizes the importance of maintaining the accuracy and completeness of data throughout its life cycle. The section also mentions the principles of ALCOA+ and the MHRA's stance on data governance within the pharmaceutical quality system. Key entities mentioned include the FDA, MHRA, and regulated companies.", "excerpt_keywords": "Data Integrity, Holistic Strategies, ISPE GAMP Guide, Behavioral Steps, Senior Management"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: records and data integrity\n\nregulated companies should implement meaningful and effective holistic strategies to manage risks to data integrity, based upon their process understanding and knowledge of technologies. critical analysis skills should be applied to identify and adequately control risks to data integrity, and to investigate and address root causes if failures occur. the impact on quality may be determined by considering the type of decisions or activities influenced by the data. 
risk to data reflects its vulnerability to unauthorized or inadvertent deletion or amendment, and the opportunity for detection during routine review. risk to data is typically increased by complex, inconsistent processes, with open-ended outcomes that are open to subjective interpretation, compared to simple tasks that are consistent, well defined, and objective [8].\n\nregulated companies should maintain an inventory of systems generating and maintaining data and the capability of each system. an inventory of documents should be maintained within the quality management system (qms) [6]. data governance activities should be integral to the qms [1]. data governance activities should be supported by an organizational culture that enforces data integrity, led by senior management leadership and behavior. senior management should:\n\n- take primary responsibility for data integrity by initially understanding the capabilities and limitations of existing processes, methods, environment, personnel, and technologies\n- subsequently ensure the allocation of necessary resources to address any identified limitations and maintain data integrity\n\nas part of data governance, regulated companies should take the following three types of steps to achieve an acceptable level of data integrity:\n\n- behavioral\n- procedural\n- technical\n\n### behavioral steps\n\nsenior management should take responsibility for establishing and maintaining a culture which supports data integrity. data governance should address the:\n\n- ownership of data throughout the data life cycle\n- training of personnel in the importance of data integrity principles [8]\n- creation of a working environment that:\n- enables visibility of errors, omissions, and atypical (aberrant) results [8]\n- encourages transparent investigation and analysis\n\npersonnel should be trained in the importance of data integrity, and in the methods of detecting data integrity issues, as part of routine gxp training programs. 
see section 3.3.7.\n\nsenior management should also establish arrangements to ensure personnel are not subject to commercial, political, financial, and other pressures or conflicts of interest that may adversely affect the quality of their work and integrity of their data [12].", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "78960670-c5a1-415b-a044-c2235af43c69": {"__data__": {"id_": "78960670-c5a1-415b-a044-c2235af43c69", "embedding": null, "metadata": {"page_label": "21", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Data Integrity and Quality Management in Regulated Companies: Best Practices and Strategies", "questions_this_excerpt_can_answer": "1. What specific strategies does the document recommend for minimizing potential risks to data integrity in regulated companies, and how does it suggest handling residual risks?\n \n2. How does the document propose regulated companies should manage and control the use of blank paper templates for original data recording to ensure data integrity and traceability?\n\n3. In what ways does the document suggest regulated companies should adapt their quality management and data governance approaches to account for cultural differences and dynamics that may affect the societal acceptability of open reporting of problems and challenging of hierarchy?", "prev_section_summary": "The section discusses the importance of implementing holistic strategies to manage risks to data integrity in regulated companies. 
It emphasizes the need for a thorough understanding of processes and technologies, as well as critical analysis skills to identify and control risks. The role of senior management in maintaining data integrity is highlighted, with specific steps outlined for allocating resources and fostering a culture supporting data integrity. The section also covers the three types of steps (behavioral, procedural, technical) that regulated companies should take to achieve an acceptable level of data integrity, with a focus on the behavioral steps such as ownership of data, personnel training, and creating a transparent working environment. Additionally, the importance of training personnel in data integrity principles and detecting issues is emphasized, along with the need to prevent conflicts of interest that could impact data integrity.", "excerpt_keywords": "Data Integrity, Quality Management, Regulated Companies, Risk Management, Cultural Differences"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## records and data integrity\n\nthe quality manual, or equivalent document, should include a quality policy statement of management's commitment to an effective qms. the policy statement should include a code of ethics and a code of proper conduct which are intended to assure the reliability and completeness of data, including mechanisms for personnel to report any quality and compliance questions or concerns to management [12].\n\nimplementation of an effective quality culture and data governance may be different in different locations. 
a single approach to quality management or data governance may not be effective in all situations, e.g., if cultural differences and dynamics challenge the societal acceptability of open reporting of problems, and challenging of hierarchy.\n\n### procedural steps\n\nregulated companies should implement systems and procedures to:\n\n- minimize the potential risk to data integrity [1]\n- identify the residual risk, using risk management techniques following the principles of ich q9 [10, 1]\n- assess risks associated with any third parties. this should include:\n- consideration of the control of associated risks\n- how the risks may be addressed in contracts and quality/technical agreements\n\ndata which is manually recorded requires a high level of supervision. supervisory measures or technical controls should be considered to reduce risk. examples include second person verification at the same time as the data entry is made or cross checks of related information sources [8].\n\nrecords should be accessible at locations where regulated activities take place. ad hoc data recording and later transcription to official records should be discouraged and should not be necessary [1].\n\naccess to blank paper templates for original data recording should be controlled, where appropriate. reconciliation of blank paper templates and documentation may be necessary to prevent recreation of a record without control and traceability.\n\nblank forms (including worksheets, laboratory notebooks, and master production control records) should be controlled by the quality unit or by another document control method. for example, numbered sets of blank forms may be issued, as appropriate. the numbering of completed forms should be compared to the numbering of all issued blank forms to ensure that they match. 
incomplete or erroneous forms should be kept as part of the permanent record along with written justification for their replacement [9].\n\nregulated companies should allow and encourage correct performance of tasks and accurate recording of data, as required; regulated companies should control physical aspects such as space, equipment, and the timing of events to support these activities. regulated records and data (including hybrid situations) should be held in secured areas. appropriate access should be provided to all relevant data for personnel performing data checking activities.\n\n### technical steps\n\ngxp computerized systems should be validated for intended use, supporting infrastructure qualified, and equipment qualified and calibrated as necessary.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "d322710e-a555-461c-9444-6f0d24cae694": {"__data__": {"id_": "d322710e-a555-461c-9444-6f0d24cae694", "embedding": null, "metadata": {"page_label": "22", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Ensuring Data Integrity and Security in Regulated Environments: Best Practices and Guidelines", "questions_this_excerpt_can_answer": "1. What specific measures does the ISPE GAMP\u00ae Guide recommend for ensuring that access to regulated data and records is appropriately controlled and traceable to individual users within regulated companies?\n \n2. 
How does the ISPE GAMP\u00ae Guide suggest handling the assignment of elevated privileges, such as the ability to delete data or amend databases, to prevent conflicts of interest in data management within regulated environments?\n\n3. According to the ISPE GAMP\u00ae Guide, what considerations should be made regarding the use of audit trails in the management of regulated records, and how should alterations to these records be documented to maintain compliance with regulatory expectations?", "prev_section_summary": "The section discusses the importance of records and data integrity in regulated companies, emphasizing the need for a quality policy statement, code of ethics, and code of proper conduct to ensure reliability and completeness of data. It also highlights the importance of implementing effective quality culture and data governance, considering cultural differences and dynamics that may affect reporting and hierarchy challenges. The section outlines procedural steps for minimizing potential risks to data integrity, assessing residual risks, and controlling access to blank paper templates for data recording. It also emphasizes the need for supervision of manually recorded data, controlled access to blank forms, and secure storage of regulated records and data. Additionally, it mentions the validation of computerized systems and qualification of supporting infrastructure and equipment for data integrity.", "excerpt_keywords": "ISPE, GAMP Guide, data integrity, regulated companies, audit trails"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: records and data integrity\n\nregulated companies should apply access controls to ensure that people have access only to functionality that is appropriate for their job role, and that actions are attributable to a specific individual [1]. 
controls should prevent the unauthorized deletion or modification of regulated data and records, inside or outside the software application (e.g., limiting user rights). regulated companies should be able to demonstrate the access levels granted to individual staff members and ensure that historical information regarding user access level is available [1]. user access arrangements should prevent (or if necessary log) unauthorized data amendments.\n\nbasic access control measures should be established (e.g., unique usernames and private passwords, and policies and processes for their management). shared logins or generic user access should not be used as they do not allow for traceability to an individual. where the computerized system design supports individual user access, this function should be used [1].\n\nappropriate segregation of duties should be applied. account privileges should be limited to those required for individuals to perform their duties, e.g., users, supervisors, quality unit and administrators.\n\nelevated privileges permitting activities such as data deletion, database amendment, or system configuration changes (e.g., system administrator rights), should not be assigned to individuals with a direct interest in the data (e.g., data generation, or data review or approval) [1].\n\nwhere possible automated data capture techniques should be applied to minimize the risk of data transcription error. appropriately controlled and synchronized clocks should be available for recording timed events.\n\ncomputerized systems should be designed to ensure that the execution of critical steps is recorded at the same time as they are performed [8], and are individually traceable. 
the eu gmp annex 11 [7] states that:\n\n\"consideration should be given, based on a risk assessment, to building into the system the creation of a record of all gmp-relevant changes and deletions (a system generated \"audit trail\").\"\n\naudit trails are required when users create, modify, or delete regulated records during normal operation [16]. the reason for change or deletion of regulated data should be documented and be consistent with regulatory expectations. the eu gmp chapter 4 [6] states that:\n\n\"any alteration made to the entry on a document should be signed and dated; the alteration should permit the reading of the original information. where appropriate, the reason for the alteration should be recorded.\"\n\naudit trail functionality should be available, enabled, and verified. routine data review should include audit trail review where relevant.\n\ndata transfer or migration should be designed and validated to ensure that data integrity principles are maintained. copies of records should preserve the integrity (accuracy, completeness, content, and meaning) of the original record. backup and recovery processes should be validated and periodically tested. security controls should be in place to ensure the data integrity of the record throughout the retention period. security controls should be validated, where appropriate.\n\narchival arrangements should be established for the long-term retention of regulated data (this should be accurate and complete) in compliance with legislation. the procedures for destruction of data should consider data criticality and any applicable regulatory or legal requirements [1].\n\nwhere outsourced or cloud services are used, attention should be paid to understanding the service provided, ownership, retrieval, retention and security of data [8], and the role quality agreements can play in such understanding. 
the physical location where the data is held, including the impact of any laws applicable to that geographic location, should be considered. the responsibilities of the service provider (particularly quality related aspects and requirements) should be defined in a technical/quality agreement or contract [8].", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "a6a154e5-1ab2-4f74-94f5-d2c0749f65a6": {"__data__": {"id_": "a6a154e5-1ab2-4f74-94f5-d2c0749f65a6", "embedding": null, "metadata": {"page_label": "23", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Data Governance Framework for Ensuring Data Integrity and Records Management", "questions_this_excerpt_can_answer": "1. What specific definition of data governance is referenced in the ISPE Records and Data Integrity Guide, and which organization provided this definition?\n \n2. How does the ISPE Records and Data Integrity Guide categorize the elements within the data governance framework, and what are the main categories outlined?\n\n3. According to the ISPE Records and Data Integrity Guide, what role does human factors play in ensuring data integrity within the context of a data governance framework?", "prev_section_summary": "The section discusses the importance of data integrity and security in regulated environments, as outlined in the ISPE GAMP\u00ae Guide. 
Key topics include access controls, segregation of duties, audit trails, data capture techniques, data transfer and migration, backup and recovery processes, security controls, archival arrangements, and considerations for outsourced or cloud services. Entities mentioned include regulated companies, individuals with elevated privileges, system administrators, users, supervisors, quality unit, administrators, service providers, and data.", "excerpt_keywords": "Data governance, Records management, Data integrity, ISPE, Data life cycle"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n### records and data integrity\n\n### data governance framework\n\nintroduction\nthis section describes a framework for data governance, covering:\n- definition and overview of data governance\n- elements of data governance\n- importance of human factors in data integrity\n- maturity levels for data governance\n\n### overview\n\ndata governance may be defined as (mhra, 2015 [1]):\n\n\"the sum total of arrangements to ensure that data, irrespective of the format in which it is generated, is recorded, processed, retained and used to ensure a complete, consistent and accurate record throughout the data life cycle.\"\n\ndata governance ensures formal management of records and data throughout the regulated company. data governance encompasses the people, processes, and technology required to achieve consistent, accurate, and effective data handling. data governance provides the structure within which appropriate decisions regarding data related matters may be made according to agreed models, principles, processes, and defined authority. 
it may also be considered as a quality assurance and control approach for applying rigor and discipline to the process of managing, using, protecting, and improving organizational information.\n\nfigure 3.1: elements of the data governance framework - management systems and governance, encompassing behavioral controls (people), procedural controls (process), and technical controls (technology)", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "d7c66be3-577a-4677-82ba-ee3dd8958744": {"__data__": {"id_": "d7c66be3-577a-4677-82ba-ee3dd8958744", "embedding": null, "metadata": {"page_label": "24", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Data Governance and Data Integrity in Regulated Companies: Ensuring Compliance and Security", "questions_this_excerpt_can_answer": "1. How does the ISPE GAMP\u00ae Guide suggest regulated companies should integrate data governance activities within their existing quality management systems (QMS)?\n \n2. What role does senior management play in the effectiveness of a data governance framework according to the ISPE GAMP\u00ae Guide, and what specific actions are recommended to demonstrate their commitment?\n\n3.
According to the ISPE GAMP\u00ae Guide, how should regulated companies manage the risk to data integrity associated with outsourcing activities or the use of service providers, and what specific measures are recommended to ensure compliance and integrity in these scenarios?", "prev_section_summary": "The section discusses the data governance framework outlined in the ISPE Records and Data Integrity Guide. It defines data governance as arrangements to ensure complete, consistent, and accurate data throughout its lifecycle. The framework includes elements such as management systems, behavioral controls, procedural controls, and technology controls. Human factors are highlighted as important in ensuring data integrity within the framework. The section emphasizes the importance of formal management of records and data, encompassing people, processes, and technology to achieve consistent and effective data handling.", "excerpt_keywords": "ISPE GAMP Guide, data governance, data integrity, regulated companies, senior management"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: records and data integrity\n\nelements of data governance are closely related to regulatory requirements. data governance activities should be integral to the qms [1]. a specific corporate data integrity program may be necessary to address any gaps in a regulated company's existing quality management system.\n\nsenior management commitment and leadership is considered essential to the effectiveness of the data governance framework. lack of explicit and demonstrable senior management commitment risks ineffective data governance.\n\ntraining should be provided for personnel on the importance of data integrity principles and policies [8]. all personnel should be encouraged to report instances of data integrity failures, bad practice, or falsification, without fear of penalty.
such reports should be fully and transparently investigated by senior management, including root cause analysis and the establishment of prevention or detection measures.\n\ndata governance should also address data ownership and responsibilities throughout the data life cycle [1]. individuals accountable and responsible for specific data, and its integrity and compliance, at various stages of the data life cycle should be defined and documented.\n\nregulated companies may recognize the need to manage data as a corporate asset. an executive level role, such as a chief data officer (cdo) may be appointed to oversee this area.\n\nthe specification, design, validation, and operation of processes and systems should meet the defined requirements for regulated data integrity. this should include ensuring appropriate control over intentional, unintentional, authorized, and unauthorized changes to regulated data.\n\nthe risk to data integrity, especially as it may be related to risk to product quality and product safety should be managed by an established quality risk management process, and defined as part of the qms. this should include consideration of the risk to data integrity associated with any outsourcing of activities or use of service providers, which should be assessed and managed through appropriate formal agreements. risk associated with use of summary reports or data should be considered, and this may include a review of the mechanisms used to generate and distribute summary data and reports, based on risk.\n\nthe data governance approach should be holistic, proportionate, and integrated (mhra, 2015 [1]):\n\n\"the data governance system should be integral to the pharmaceutical quality system described in eu gmp chapter 1. the effort and resource assigned to data governance should be commensurate with the risk to product quality, and should also be balanced with other quality assurance resource demands. 
as such, manufacturers and analytical laboratories are not expected to implement a forensic approach to data checking on a routine basis, but instead design and operate a system which provides an acceptable state of control based on the data integrity risk, and which is fully documented with supporting rationale.\"\n\nmanual systems and paper-based records may be a key area of data integrity failure. risks associated with manual systems, including risks at the interface between manual and computerized systems, uncontrolled copies, and multiple inconsistent copies, should also be considered. computerized systems related activities are only one part of the broader governance framework, and equivalent considerations are required for paper-based systems and processes. see appendix o2.\n\nhuman factors are a critical aspect of an effective data governance framework, including the topics of cultural differences, human error, understanding and awareness, and motivation and behavior. see section 3.4 and appendix m3.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "0aa15273-cc8b-42e0-bb59-170ee61d9ca9": {"__data__": {"id_": "0aa15273-cc8b-42e0-bb59-170ee61d9ca9", "embedding": null, "metadata": {"page_label": "25", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Data Governance Framework for Ensuring Data Integrity", "questions_this_excerpt_can_answer": "1. What are the key elements that constitute the data governance framework as outlined in the ISPE Records and Data Integrity Guide?\n \n2. 
How does the ISPE Records and Data Integrity Guide suggest a regulated company should approach establishing a data integrity program, including the roles of senior management and the importance of training?\n\n3. What specific strategies does the document recommend for managing the data life cycle within the context of ensuring data integrity, including the role of quality risk management and system access security?", "prev_section_summary": "This section discusses the importance of data governance and data integrity in regulated companies, as outlined in the ISPE GAMP\u00ae Guide. Key topics include the integration of data governance activities within existing quality management systems, the role of senior management in ensuring effectiveness of data governance, managing risks associated with outsourcing activities or service providers, data ownership and responsibilities throughout the data life cycle, and the need for a holistic and proportionate data governance approach. The section also emphasizes the importance of training personnel on data integrity principles, transparent investigation of data integrity failures, and the role of human factors in an effective data governance framework. 
Additionally, it highlights the need for appropriate control over intentional and unintentional changes to regulated data, risk management processes, and considerations for manual and paper-based systems in ensuring data integrity.", "excerpt_keywords": "Data Governance, Data Integrity, ISPE, Quality Risk Management, System Access Security"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## records and data integrity\n\n3.3 elements of the data governance framework\n\nelements of the data governance framework\ngoals and objectives\norganization and data ownership\n- leadership and management responsibility\n- roles and responsibilities\n- policies and standards\n- awareness and training\nstrategic planning and data integrity program\ndata life cycle\n- quality risk management\n- data management\n- data incident and problem management\n- system access and security management\nsupporting processes\n- auditing\n- metrics\n- classification\n- validation\nit architecture and infrastructure\nmaturity level model\n\nthe regulated company should define clear data governance goals and objectives, supported by a specific data integrity program, if necessary. senior management should establish appropriate organizational structures to achieve these objectives, with clear policies, standards, and procedures, supported by appropriate training. appropriate controls should be applied throughout the data life cycle, based on quality risk management. suitable supporting processes (including system validation) and suitable it architecture and infrastructure should be established.
see figure 3.2.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "e7db645d-5c19-4cb0-8df7-d6869ef35dee": {"__data__": {"id_": "e7db645d-5c19-4cb0-8df7-d6869ef35dee", "embedding": null, "metadata": {"page_label": "26", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Data Governance Framework for Regulated Companies: Ensuring Compliance, Data Quality, and Risk Management", "questions_this_excerpt_can_answer": "1. What are the general goals and objectives that regulated companies should aim for in their data governance frameworks according to the ISPE GAMP\u00ae Guide: Records and Data Integrity?\n\n2. How does the ISPE GAMP\u00ae Guide suggest regulated companies should adapt their data governance activities based on their specific business context and the nature of their operations?\n\n3. What are the key elements and supporting processes outlined in the Data Governance Framework for Regulated Companies to ensure compliance, data quality, and risk management as per the document titled \"Data Governance Framework for Regulated Companies: Ensuring Compliance, Data Quality, and Risk Management\"?", "prev_section_summary": "The section discusses the key elements of the data governance framework outlined in the ISPE Records and Data Integrity Guide, including goals and objectives, organization and data ownership, strategic planning, data life cycle management, supporting processes, IT architecture, and infrastructure. 
It emphasizes the importance of senior management's leadership and responsibility, roles and responsibilities, policies and standards, awareness and training, quality risk management, system access security, auditing, metrics, classification, validation, and maturity level model in ensuring data integrity within a regulated company.", "excerpt_keywords": "ISPE, GAMP, Data Governance, Compliance, Risk Management"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: records and data integrity\n\nfigure 3.2: data governance framework - organization (leadership, culture and human factors, responsibilities, ownership, awareness, policies, training, procedures), controls (behavioral, procedural, technical), supporting processes (auditing, metrics, classification, validation), and it architecture (business processes, it infrastructure)\n\n### scope and objectives\n\neffective data governance requires regulated companies to be clear on the scope and objectives. general goals and objectives for data governance may include:\n\n- increasing consistency and confidence in decision making\n- decreasing compliance risk\n- improving data security and privacy\n- maximizing the potential business value of data\n- clarifying information ownership and accountability for data quality\n- minimizing or eliminating re-work\n- optimizing process effectiveness\n\ndata governance activities may have a different scope and objective depending on the nature, situation, and the business context of the regulated company, and it is likely that some focus and prioritization will be required.
although the overall structures and concepts in this guide would be appropriate for many regulated companies, the discussion will concentrate on focus areas for a regulated company, specifically around compliance, regulated data quality, and managing risks to data integrity, and therefore, product quality and patient safety.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "8a7c5ca6-d226-4a7d-8fdb-d806f4c5a395": {"__data__": {"id_": "8a7c5ca6-d226-4a7d-8fdb-d806f4c5a395", "embedding": null, "metadata": {"page_label": "27", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Data Governance and Leadership in Regulated Companies: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. What specific roles does senior management play in ensuring data integrity within regulated companies, according to the ISPE Records and Data Integrity Guide?\n \n2. How does the ISPE Records and Data Integrity Guide suggest regulated companies should approach the creation and function of a data governance council to enhance data integrity?\n\n3. What are the outlined goals and objectives for data governance in regulated companies as detailed in the ISPE Records and Data Integrity Guide, and how do these objectives contribute to product quality and patient safety?", "prev_section_summary": "The section discusses the Data Governance Framework for Regulated Companies as outlined in the ISPE GAMP\u00ae Guide: Records and Data Integrity. 
It emphasizes the importance of clear scope and objectives for data governance, such as increasing consistency in decision making, decreasing compliance risk, and improving data security. The framework includes elements like organization leadership, culture, responsibilities, policies, training, procedures, controls, auditing, metrics, validation, IT architecture, business processes, and IT infrastructure. The document highlights the need for tailored data governance activities based on the specific business context and operations of regulated companies to ensure compliance, data quality, and risk management for product quality and patient safety.", "excerpt_keywords": "Data Governance, Leadership, Regulated Companies, Data Integrity, ISPE Guide"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n### records and data integrity\n\nspecific data governance goals and objectives for a regulated company may include:\n\n- demonstrating fitness for intended use through computerized system validation\n- assessing and controlling regulated data integrity risk\n- effective compliance with gxp regulations\n- minimizing inspection risk\n- compliance with various data privacy laws and regulations\n- ensuring adequate data security and access control\n- achieving these objectives effectively throughout a wide range of sites encompassing many local cultures and circumstances\n\nfor a regulated company, the key objectives should be product quality and patient safety, for which appropriate data governance delivering acceptable data integrity is considered a prerequisite.\n\ndata governance goals, objectives, and scope should be defined and communicated by senior management, based on significant input from business process owners, and quality unit and information technology functions.\n\n### leadership and management responsibility\n\nsenior management with executive authority has a 
responsibility across all levels of the regulated company to:\n\n- promote the requirements for data integrity\n- provide appropriate resources\n- resolve issues\n- define priorities\n- ensure that data integrity expectations are achieved\n\nsenior management should also make personnel aware of the relevance of data integrity and the importance of their role in protecting the safety of patients and the reputation of the organization.\n\nsenior management should lead by example and reinforce the messages by positive action, rewarding appropriate behavior, and taking the necessary management action when data integrity expectations and policies are not met.\n\nit may be helpful to create a data governance council, or equivalent, ensuring adequate input from business process owners, qu, and it.\n\na data governance council, or equivalent, could play a key role in:\n\ndefining policies\ntaking decisions on roles and accountabilities\nleading initiatives aimed at raising awareness", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "b7099f3c-5f0b-4492-a383-b481c98fef25": {"__data__": {"id_": "b7099f3c-5f0b-4492-a383-b481c98fef25", "embedding": null, "metadata": {"page_label": "28", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Data Governance and Ownership in Regulated Companies: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. 
How does the ISPE GAMP\u00ae Guide define the roles of \"process owner\" and \"system owner\" in the context of regulated computerized systems, and how might these definitions align or differ from those provided in EU GMP Annex 11?\n\n2. What are the recommended practices for establishing and monitoring Key Performance Indicators (KPIs) in data integrity programs within regulated companies, according to the ISPE Records and Data Integrity Guide?\n\n3. In the framework of data governance and ownership in regulated companies, what is the advised approach for delineating responsibilities and ensuring the integrity and compliance of data throughout its lifecycle, as outlined in the ISPE Records and Data Integrity Guide?", "prev_section_summary": "The section discusses the importance of data governance and leadership in regulated companies, focusing on specific goals and objectives for ensuring data integrity. Key topics include demonstrating fitness for intended use, assessing and controlling data integrity risk, compliance with regulations, data security, and access control. Senior management is highlighted as having a crucial role in promoting data integrity, providing resources, resolving issues, and defining priorities. The section also emphasizes the need for a data governance council to define policies, roles, and accountabilities, and lead initiatives to raise awareness about data integrity. 
Overall, the main entities involved in this section are senior management, business process owners, quality unit, and information technology functions.", "excerpt_keywords": "ISPE, GAMP, data integrity, data governance, regulated companies"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: records and data integrity\n\ndealing with serious data related problems or incidents, such a body would typically be led by a member of executive management, e.g., the chief data officer.\n\n### organization and data ownership\n\ndata ownership and responsibilities should be defined in the data governance framework and wider qms. individuals accountable and responsible for specific data, and its integrity and compliance, at various stages of the data life cycle should be defined and documented.\n\ndata governance should not be regarded as primarily an it issue; the it function has primarily a supporting role. while it may play an important role for computerized systems, it typically has no involvement with non-electronic records and data. effective data governance in regulated companies requires communication and co-operation between business process owners, quality assurance, the it department, and other technical support departments such as engineering, with sufficient support and leadership from senior management.\n\n### key performance indicators\n\nleadership teams should monitor the progress of data integrity programs, covering assessment and remediation activities, as well as monitoring the changing risk profile as work proceeds. typically, a small number of key performance indicators (kpis) will be identified for tracking in a summary dashboard. the nature and content of kpi dashboards will vary depending on the specific needs, points of focus, and the priority of the data integrity work within the regulated company. 
data for the selected kpis should be readily available to avoid creating excessive work to collect supporting information. kpis should be a predictor of the challenges ahead, not just a record of what has already been achieved. kpis should be clearly defined to promote consistency when the supporting data is collected. ambiguous definitions can lead to misunderstandings when deciding exactly what to count and not count within the kpi being measured. kpis should not give an over-optimistic or over-pessimistic view that could impede effective management. although leadership teams may like simple presentation of data (e.g., red, amber, green status), oversimplification should be avoided. reporting should be accurate and based on metrics. leadership should also be aware of different levels of risk tolerance across the regulated company.\n\n### roles and responsibilities\n\ntwo key roles associated with regulated computerized systems are defined in eu gmp annex 11 [7] and ispe gamp(r) 5 [3]:\n\n- process owner\n- system owner\n\nthe terms and definitions used in specific organizations and the boundaries between such roles may vary. 
the use of the terms in this guide and the role descriptions below are aligned with the definitions in ispe gamp(r) 5 [3] and eu gmp annex 11 [7].", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "3e806ead-557a-4a2d-8b41-69e31ee84077": {"__data__": {"id_": "3e806ead-557a-4a2d-8b41-69e31ee84077", "embedding": null, "metadata": {"page_label": "29", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Records and Data Integrity Ownership and Management Guidelines", "questions_this_excerpt_can_answer": "1. What specific responsibilities does a process owner have in ensuring the compliance and fitness for intended use of a computerized system within a pharmaceutical environment, according to the ISPE Records and Data Integrity Guide?\n\n2. How does the ISPE Records and Data Integrity Guide differentiate between the roles and responsibilities of a process owner and a system owner in the context of managing data integrity and system maintenance in regulated pharmaceutical processes?\n\n3. According to the ISPE Records and Data Integrity Guide, what are the key activities a system owner must undertake to ensure the security, availability, and proper maintenance of a system, especially in the context of supporting regulated processes and maintaining regulated data and records?", "prev_section_summary": "This section discusses the importance of data governance and ownership in regulated companies, as outlined in the ISPE Records and Data Integrity Guide. 
It covers topics such as organization and data ownership, key performance indicators (KPIs) for monitoring data integrity programs, and roles and responsibilities, including those of process owner and system owner in regulated computerized systems. The section emphasizes the need for effective communication and cooperation between various departments and leadership teams to ensure data integrity and compliance throughout the data lifecycle.", "excerpt_keywords": "ISPE, Records, Data Integrity, Ownership, Management"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## records and data integrity\n\nprocess owner\n\nthis is the owner of the business process or processes being managed. the process owner is ultimately responsible for ensuring that the computerized system and its operation is in compliance and fit for intended use in accordance with applicable sops. the process owner may also be the system owner. the process owner may be the de facto owner of the data residing on the system (data owner) and therefore, ultimately responsible for the integrity of the data. 
process owners are typically the head of the functional unit using the system.\n\nspecific activities may include:\n\n- approval of key documentation as defined by plans and sops\n- providing adequate resources (personnel including smes, and financial resources) to support development and operation of the system\n- ensuring adequate training for end users\n- ensuring that sops required for operation of the system exist, are followed, and are reviewed periodically\n- ensuring changes (including business process changes) are approved and managed\n- reviewing assessment/audit reports, responding to findings, and taking appropriate actions to ensure gxp compliance\n- ensuring that processes/systems are fit for the intended business use, and support data integrity\n- ensuring that data integrity risks are identified and controlled to acceptable levels\n\nsystem owner\n\nthe system owner is responsible for the availability, and support and maintenance, of a system and for the security of the data residing on that system. the system owner is responsible for ensuring that the computerized system is supported and maintained in accordance with applicable sops. the system owner also may be the process owner (e.g., for it infrastructure systems or systems not directly supporting gxp).\n\nfor systems supporting regulated processes and maintaining regulated data and records the ownership of the data resides with the gxp process owner, not the system owner.\n\nthe system owner acts on behalf of the users. the system owner for larger systems will typically be from it or engineering functions. 
global it systems may have a global system owner and a local system owner to manage local implementation.\n\nspecific activities may include:\n\n- approval of key documentation as defined by plans and sops\n- ensuring that sops required for the maintenance of the system exist and are followed\n- ensuring adequate training for maintenance and support staff\n- ensuring changes (including technical changes) are managed\n- system life cycle management, including system upgrade and replacement planning", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "8c51ab00-f84b-49b5-bd24-bf82a248d4df": {"__data__": {"id_": "8c51ab00-f84b-49b5-bd24-bf82a248d4df", "embedding": null, "metadata": {"page_label": "30", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Ensuring Data Integrity and Governance in Regulated Companies: A Comprehensive Guide\"", "questions_this_excerpt_can_answer": "1. How does the ISPE GAMP\u00ae Guide define the role and responsibilities of a data steward in the context of data integrity and governance within regulated companies?\n \n2. What specific actions does the ISPE GAMP\u00ae Guide recommend for ensuring systems are maintained to support data integrity and are fit for their intended business use within regulated environments?\n\n3. 
According to the ISPE GAMP\u00ae Guide, what are the key components and processes that should be included in the quality management system (QMS) to uphold data integrity standards and avoid unnecessary duplication in regulated companies?", "prev_section_summary": "The section discusses the roles and responsibilities of process owners and system owners in ensuring data integrity and system maintenance in a pharmaceutical environment, as outlined in the ISPE Records and Data Integrity Guide. Process owners are responsible for ensuring compliance and fitness for use of computerized systems, while system owners are responsible for the availability, support, and maintenance of systems and data security. Specific activities for process owners include approving documentation, providing resources, ensuring training, and managing changes, while system owners focus on documentation approval, system maintenance, training for staff, and system life cycle management. The section emphasizes the importance of data integrity, gxp compliance, and identifying and controlling data integrity risks.", "excerpt_keywords": "ISPE, GAMP Guide, data steward, data integrity, data governance"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: records and data integrity\n\n* ensuring the availability of information for the system inventory and configuration management\n\n* providing adequate resources (personnel including smes, and financial resources) to support the system\n\n* reviewing audit reports, responding to findings, and taking appropriate actions to ensure gxp compliance, in conjunction with the process owner\n\n* ensuring that systems are supported and maintained such that they are fit for the intended business use, and support data integrity\n\n* ensuring that data integrity risks are identified and controlled to acceptable levels\n\ndata steward\n\nthe term 
data steward is usually used in the context of data governance. the term may be used differently and may apply to different roles in the data governance framework. it may be a functional role, or included as part of a wider job description. data stewardship activities may be embedded in the responsibilities of other roles, rather than being a new and specific individual role.\n\nin this guide, a data steward is defined as a person with specific tactical coordination and implementation responsibilities for data integrity. a data steward is responsible for implementing data usage, management, and security policies as determined by wider data governance initiatives, such as acting as a liaison between the it department and the business.\n\ndata stewards are typically members of the operational unit or department creating, maintaining, or using the data, e.g., personnel in the laboratories who generate, manage, and handle the data. segregation of duties should seek to minimize conflict of interest in the data steward role, e.g., avoiding the granting of unnecessary administrator privileges to individuals responsible for functional review and approval of gxp data.\n\npolicies and standards\n\ndata governance policies and standards should be established and communicated to all relevant staff. data policies define the expectations and rules covering the integrity, security, quality, and use of data during its life cycle. data standards provide more detail on structure, format, definition, and use of data.\n\nit should be clear who has responsibility for defining, reviewing, approving, and monitoring compliance with policies and standards. such policies and standards may be developed by a data governance council, or similar.\n\nbased on the policies and standards, practical procedures, typically in the form of sops, should be established, defining key activities and processes related to data integrity and providing details on how to achieve the defined policies and standards. 
examples include procedures for handling adverse event and complaint data and evidence, manual chromatography integration practices, and batch record assembly and review.\n\nthese policies, standards, and procedures as described above should be incorporated as an integral part of the overall qms, and unnecessary duplication should be avoided.\n\nawareness and training\n\nregulated companies should ensure sufficient training in the importance of data integrity principles and data governance activities, and awareness and training on regulatory requirements and organizational policies and standards.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "bc8ec215-4c2d-4d2f-9e62-177ec1ed63cc": {"__data__": {"id_": "bc8ec215-4c2d-4d2f-9e62-177ec1ed63cc", "embedding": null, "metadata": {"page_label": "31", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Building a Foundation for Effective Data Management: Establishing Data Integrity and Governance Programs\"", "questions_this_excerpt_can_answer": "1. What specific role does senior management play in the establishment and success of data integrity initiatives and programs within a regulated company, according to the ISPE Records and Data Integrity Guide?\n\n2. How does the ISPE Records and Data Integrity Guide suggest technology and tools should be utilized to enhance data integrity and governance within an organization?\n\n3. 
What strategies does the ISPE Records and Data Integrity Guide recommend for ensuring effective communication and stakeholder engagement in the context of data integrity programs?", "prev_section_summary": "The section discusses the importance of data integrity and governance in regulated companies, as outlined in the ISPE GAMP\u00ae Guide. Key topics include the role of a data steward, establishment of data governance policies and standards, and the importance of awareness and training on data integrity principles and regulatory requirements. The section emphasizes the need for systems to be maintained to support data integrity, identification and control of data integrity risks, and the inclusion of data integrity standards in the quality management system to avoid duplication.", "excerpt_keywords": "ISPE, Records, Data Integrity, Governance, Technology"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## records and data integrity\n\nthe aim is to achieve a state where all staff routinely follow accepted data integrity principles and practices, from a position of awareness and understanding, rather than depending on policing and technical controls to prevent users from doing the wrong thing.\n\nfor further information on training see appendix m3.\n\n### technology and tools\n\ndata governance technology and tools may be used to automate the definition, management, and enforcement of business rules at the data level. technology may assist in improving data quality and fitness for intended use by providing tools for data standardization and cleansing. systems should be designed and configured to enforce integrity and consistency rules, ensuring conformance to defined policies and standards of the organization, and applying technical controls to minimize risks to data integrity. 
other tools may include data reporting and visualization tools.\n\n### strategic planning and data integrity program\n\ndata governance initiatives and programs should be strategic and high level to provide a clear vision and direction to the regulated company, while also ensuring that critical immediate actions are prioritized, facilitated, and delivered. effective data integrity initiatives and programs need senior management sponsorship at an appropriate level. short term needs should be addressed (particularly critical compliance requirements) while the wider aspects of data governance in the organization are being developed and overall maturity level increases. data governance programs should be scaled based on the size and complexity of the business unit, level of compliance risk, and potential impact on product quality and patient safety (mhra, 2015 [1]).\n\n\"the effort and resource assigned to data governance should be commensurate with the risk to product quality, and should also be balanced with other quality assurance resource demands.\"\n\ncommunication should clearly link the data integrity program with immediate business objectives or regulatory compliance challenges and requirements, so that the value of the program is obvious to all stakeholders. a communication plan, and if necessary, a change management plan should be established to ensure ongoing stakeholder engagement and understanding, and a smooth transition to new ways of working. a process should be established for systematically incorporating learning points, and building them into the program and sharing with stakeholders. a repository of items such as templates, checklists, example citations, and faqs on an internal information sharing and collaboration site, or similar, should be considered.\n\nfor further information on corporate data integrity programs, see appendix m1. 
for further information on knowledge and information sharing, see appendix m3.\n\n### data life cycle and data management\n\nthe data life cycle should be defined in standards and procedures. see section 3.3.6.\n\nappropriate data management functions should implement the policies and standards established by the data governance framework, including:\n\n- data quality management\n- master and reference data management", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "67a70364-8ea3-48e1-84fc-f1f5fe529b8f": {"__data__": {"id_": "67a70364-8ea3-48e1-84fc-f1f5fe529b8f", "embedding": null, "metadata": {"page_label": "32", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Ensuring Data Integrity: A Comprehensive Approach through Data Incident Management, Data Inventory Management, and Human Factors", "questions_this_excerpt_can_answer": "1. What specific models and diagrams does the ISPE GAMP\u00ae Guide suggest for defining data architecture within a regulated company to ensure data integrity, and what are their purposes in the context of data life cycle implementation?\n\n2. How does the ISPE GAMP\u00ae Guide propose to manage data quality, and what specific aspects of data does it emphasize should be addressed to ensure the data's fitness for its intended purpose within a specified business or regulatory process?\n\n3. 
According to the ISPE GAMP\u00ae Guide, how do human factors and cultural considerations play a critical role in maintaining data integrity, and what strategies does it recommend for management to foster an environment conducive to data integrity compliance?", "prev_section_summary": "The section discusses the importance of establishing data integrity principles and practices within a regulated company, emphasizing the role of technology and tools in enhancing data governance. It highlights the need for strategic planning and senior management sponsorship for effective data integrity programs, as well as the importance of communication and stakeholder engagement. The section also touches on the data life cycle and data management functions, such as data quality management and master data management.", "excerpt_keywords": "ISPE, GAMP Guide, Data Incident Management, Data Quality Management, Human Factors"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: records and data integrity\n\n* data incident management\n\n* data inventory management\n\nthese activities and responsibilities should be integrated into existing roles and functions, where possible. data architecture models (which may be conceptual, logical, or physical) may be defined in the form of data flow diagrams, entity relationship diagrams, or system architecture diagrams. these, together with data standards and procedures, define how the data life cycle should be implemented within the regulated company. higher level business process mapping and business process definition are prerequisites for the successful development of detailed data related flows and diagrams. see appendix d2.\n\ndata quality relates to the data's fitness to serve its intended purpose in a given context within a specified business or regulatory process. 
data quality management activities address aspects including accuracy, completeness, relevance, consistency, reliability, and accessibility. see section 1.5.4.\n\ndata quality management enforces the established standards to ensure that data meets the relevant business definitions and rules of the data governance framework. the data incident management process should ensure that data problems and errors are identified, evaluated, resolved, and closed in a timely manner. data should be safeguarded throughout its life cycle by appropriate system access and security management procedures. systems and procedures should be established to minimize the potential risk to data integrity, identify the residual risks, and apply the principles of ich q9 [10].\n\n### human factors in data integrity\n\nconsideration of various human factors is considered critical for effective data integrity. cultural considerations can refer to a corporate culture (the model within which an organization operates) or to a local geographic culture (the moral and behavioral norm within a country or region). openness and a willingness to discuss difficult situations can support an environment where failing results are seen as a group problem needing to be resolved. management should help employees to achieve the openness around data integrity that is needed for compliance. data integrity issues often arise from genuine human error; however, regulators do not distinguish between human error and data falsification when assessing the impact of a data integrity failure. personal gain or self-interest has been the motivator in several high profile fraud cases. the extent and impact of falsification can be magnified if collusion is involved, but geographic and corporate cultures can influence the degree to which collusion may be prevented. 
robust technical controls within all of the data generation, collection, processing or storage systems, coupled with effective data review processes, can reduce opportunities for fraud. the main foundation for a high level of data integrity is the knowledge and understanding of what data integrity is, the importance it has for an organization, and the personal role each employee has in protecting it. for further information see appendix m3.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "ce1337e4-bc80-4895-a584-8071acb6cdd9": {"__data__": {"id_": "ce1337e4-bc80-4895-a584-8071acb6cdd9", "embedding": null, "metadata": {"page_label": "33", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Data Integrity Maturity Model for Regulated Companies: A Comprehensive Guide to Ensuring Compliance and Quality Assurance", "questions_this_excerpt_can_answer": "1. What is the recommended approach for regulated companies to ensure their processes and systems are aligned with data integrity requirements before utilization?\n \n2. How should regulated companies approach the acquisition of systems and technology to ensure they support adequate data integrity?\n\n3. Can you describe the structure or basis of the Data Integrity Maturity Model as outlined in the ISPE Records and Data Integrity Guide, specifically in appendix M2?", "prev_section_summary": "The section discusses the importance of data integrity in regulated companies, focusing on data incident management, data inventory management, and human factors. 
It emphasizes the need for integrating these activities into existing roles and functions, defining data architecture models, and implementing data standards and procedures. Data quality management is highlighted, addressing aspects such as accuracy, completeness, relevance, consistency, reliability, and accessibility. The role of human factors and cultural considerations in maintaining data integrity is also explored, with a focus on fostering an environment conducive to compliance and preventing data falsification through robust technical controls and effective data review processes.", "excerpt_keywords": "Data Integrity, Maturity Model, Regulated Companies, Compliance, Quality Assurance"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n### records and data integrity\n\n3.5 data integrity maturity model\n\nregulated companies should focus on modifying their processes and systems to use appropriate available technical controls, and evaluate systems for gaps prior to use. where feasible, regulated companies should design record and data integrity into their processes before purchasing systems and technology. purchased systems should be able to be configured to provide adequate data integrity. the data integrity maturity model described in appendix m2 is a simple representation of the regulated company. 
it is based on the status of essential elements of effective processes for data integrity.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "35a90664-7150-448e-8233-0b8ad3a23ae3": {"__data__": {"id_": "35a90664-7150-448e-8233-0b8ad3a23ae3", "embedding": null, "metadata": {"page_label": "34", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Blank Canvas: A Collection of Unique Entities and Themes\"", "questions_this_excerpt_can_answer": "Based on the provided context, here are three questions that the context can provide specific answers to, which are unlikely to be found elsewhere:\n\n1. **What is the file size of the \"ISPE Records and Data Integrity Guide\" as stored in the PharmaWise Engineer project on Google Drive?**\n - This question is specific to the document's digital footprint within a particular storage solution, which is detailed in the provided context.\n\n2. **What are the creation and last modification dates of the \"ISPE Records and Data Integrity Guide\" PDF used in the PharmaWise CSV & Data Integrity project?**\n - The context uniquely specifies the creation and last modification dates of the document, which are specific to this version of the document and its use within a particular project.\n\n3. 
**Under what title is the \"ISPE Records and Data Integrity Guide\" listed in the PharmaWise Engineer project's raw data directory, and how does it relate to the document's content?**\n - This question seeks to understand the rationale behind the document's title as listed in the project directory (\"Blank Canvas: A Collection of Unique Entities and Themes\"), which may suggest a thematic or organizational approach unique to this collection or project. The answer would require an understanding of the document's content in relation to its listed title, which is specific to this context.\n\nThese questions are tailored to the unique identifiers and metadata provided in the context, which are not likely to be found in general discussions or descriptions of the \"ISPE Records and Data Integrity Guide\" outside this specific project environment.", "prev_section_summary": "The section discusses the importance of data integrity for regulated companies and introduces the Data Integrity Maturity Model. It emphasizes the need for companies to align their processes and systems with data integrity requirements before utilization, and to evaluate systems for gaps. The model suggests designing data integrity into processes before purchasing systems, and ensuring purchased systems can be configured for data integrity. 
The model, outlined in appendix M2, provides a simple representation of a regulated company's status in terms of essential elements for data integrity.", "excerpt_keywords": "ISPE, Records, Data Integrity, Guide, PharmaWise Engineer"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\nno_content_here", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "3e379a98-9ad8-4add-9075-eed3bd912230": {"__data__": {"id_": "3e379a98-9ad8-4add-9075-eed3bd912230", "embedding": null, "metadata": {"page_label": "35", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Ensuring Data Integrity Throughout the Data Life Cycle: Best Practices and Strategies\"", "questions_this_excerpt_can_answer": "1. What are the key principles (ALCOA and ALCOA+) that should be maintained throughout the data life cycle to ensure data integrity, as outlined in the ISPE Records and Data Integrity Guide?\n\n2. How does the ISPE Records and Data Integrity Guide suggest handling variations in life cycle phases for different types of records and data, ensuring their integrity remains uncompromised?\n\n3. 
What specific controls does the ISPE Records and Data Integrity Guide recommend for different phases of the data life cycle, such as during data creation or the processing and use phases, to maintain data integrity?", "prev_section_summary": "The key topics and entities of the section include the file size, creation and last modification dates, and document title of the \"ISPE Records and Data Integrity Guide\" within the PharmaWise Engineer project. The section provides specific details about the digital footprint and storage location of the document, highlighting its unique identifiers and metadata within the project environment. It also raises questions about the document's title and its relationship to the content, emphasizing the context-specific nature of the information provided.", "excerpt_keywords": "ISPE, Records, Data Integrity, Data Life Cycle, ALCOA+"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## records and data integrity\n\n### data life cycle\n\n4.1 introduction\n\nthis section describes a generic data life cycle model suitable for all types of data and records. it also describes the activities and requirements for each phase in the data life cycle. variations within the life cycle phases for different record types are also addressed.\n\ndata integrity should be ensured throughout the data life cycle. data integrity controls for data and records should ensure that they remain attributable, legible, contemporaneous, original, and accurate (alcoa) throughout the data life cycle. in addition, data and records should be complete, consistent, enduring and available (alcoa+). see section 1.5.4.\n\nthe data life cycle includes all phases from initial creation of the data, through processing, use, retention and retrieval to eventual destruction, as shown in figure 4.1. 
the data life cycle also supports instructions (e.g., specifications, procedures and templates) and records/reports (including data and results).\n\ndifferent life cycle phases may need different controls, e.g., second person verification of data during creation or audit trail review during processing and/or use.\n\nfigure 4.1: data life cycle (cycle of phases: creation, processing, review/reporting and use, retention and retrieval, destruction)", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "03f0f302-eb45-45d5-8b1e-29db9e65287d": {"__data__": {"id_": "03f0f302-eb45-45d5-8b1e-29db9e65287d", "embedding": null, "metadata": {"page_label": "36", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Ensuring Data Integrity Throughout the Data Life Cycle: Best Practices and Strategies\"", "questions_this_excerpt_can_answer": "1. What are the recommended practices for ensuring data integrity throughout the data life cycle as outlined in the ISPE GAMP\u00ae Guide: Records and Data Integrity?\n \n2. How does the ISPE GAMP\u00ae Guide suggest handling data transfers across various boundaries and interfaces to maintain data integrity, especially in the context of cloud-based applications and organizational boundaries?\n\n3. 
According to the ISPE GAMP\u00ae Guide, what specific measures should be taken during the data creation phase to prevent data integrity issues, including the handling of data captured from instruments or measuring systems?", "prev_section_summary": "The section discusses the importance of data integrity throughout the data life cycle, outlining key principles such as ALCOA and ALCOA+. It describes a generic data life cycle model for all types of data and records, addressing variations within life cycle phases for different record types. The section emphasizes the need for controls to maintain data integrity, such as second person verification during data creation and audit trail review during processing and use. The data life cycle includes phases from data creation to destruction, supporting instructions and records/reports.", "excerpt_keywords": "ISPE, GAMP, data integrity, data life cycle, cloud-based applications"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: records and data integrity\n\neach life cycle phase can have an impact on data integrity. robust risk-based business processes should be defined and implemented, and data flows should be understood. this understanding, and these risk-based business processes, should be used to identify, assess, mitigate, and communicate potential data integrity issues throughout the data life cycle.\n\na specific data life cycle should be defined, based on a thorough understanding of the supported business process. quality risk management should be applied throughout the data life cycle. gxp computerized systems supporting the data life cycle and business process should be validated for their intended use, including verification of compliance with regulatory requirements and expectations for data integrity. 
regulated companies should ensure that their compliance activities include appropriate specification and verification of data integrity requirements and controls. regulated companies should not rely only on supplier qualification packages [13].\n\nfor further details of verification activities see the ispe gamp(r) good practice guide: a risk-based approach to testing of gxp systems [17]. for further details on all aspects of system operation see the ispe gamp(r) good practice guide: a risk-based approach to operation of gxp computerized systems [18].\n\ndata associated with a product or process may cross various boundaries and interfaces throughout the data life cycle [13]. these boundaries may include transfer:\n\n- of data between systems\n- between a manual process and a computerized system\n- to cloud-based applications and storage\n- across organizational boundaries, e.g., between production, qc and qa (internal boundaries)\n- between regulated companies and third parties, e.g., service providers (external boundaries)\n\nrisks associated with such transfers should be considered, and appropriate controls established to prevent loss or modification.\n\n## 4.2 data creation\n\ndata capture or recording should ensure that data of appropriate accuracy, completeness, content, and meaning is collected and retained for its intended use [8]. data integrity may be compromised at the point of creation. if the original data is not reliable, then its integrity cannot be ensured. data creation should provide accurate data needed throughout the business process for decisions based on that data.\n\ndata can be created either by entry of new data, captured by the system from an instrument, device, another system, or captured manually. 
where data is created by an instrument or measuring system, the instrument should be adequately maintained and calibrated, and should have the range, resolution, linearity, and sensitivity to accurately measure and/or detect the sample attributes under evaluation.\n\ndata should be captured and saved at the time of the activity (contemporaneously), and prior to proceeding to the next activity in the process. where possible, automated data capture techniques should be applied to minimize the risk of data transcription error. appropriately controlled and synchronized clocks should be available for recording timed events [8]. time and date stamps used should be explicit within the context of their use and should be protected from unauthorized change.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "0bc9ebc0-1c0b-437a-b4b0-b436ff1868df": {"__data__": {"id_": "0bc9ebc0-1c0b-437a-b4b0-b436ff1868df", "embedding": null, "metadata": {"page_label": "37", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Ensuring Data Integrity and Proper Data Processing in GxP Decision Making: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. What are the guidelines for excluding data from GxP decision-making processes or investigations, according to the ISPE GAMP(r) guide, and how should such exclusions be documented?\n \n2. 
How does the ISPE GAMP(r) guide recommend handling electronic data storage to prevent unauthorized manipulation and ensure data integrity, especially in terms of data attribution and the use of temporary memory?\n\n3. What specific considerations does the ISPE GAMP(r) guide suggest should be included in a risk assessment to determine the rigor of controls and verification required during the data processing step, particularly in relation to product quality and patient safety?", "prev_section_summary": "The section discusses the importance of data integrity throughout the data life cycle as outlined in the ISPE GAMP\u00ae Guide. It emphasizes the need for robust risk-based business processes, understanding data flows, and applying quality risk management. The section also addresses data transfers across various boundaries and interfaces, including between systems, manual processes, cloud-based applications, and organizational boundaries. It highlights the risks associated with data transfers and the need for appropriate controls to prevent loss or modification. Additionally, the section covers data creation, emphasizing the importance of accurate data capture, maintenance and calibration of instruments, contemporaneous data capture, automated data capture techniques, and the use of synchronized clocks for recording timed events.", "excerpt_keywords": "Data Integrity, GxP, ISPE GAMP, Electronic Data Storage, Risk Assessment"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: page 35\n\n### records and data integrity\n\nall data (even if it has been excluded) should be retained, and be available for review in a format that allows the validity of the decision to exclude the data to be confirmed [8]. 
if any data is excluded from gxp decision making processes or investigations (e.g., the product release process), there should be a valid, documented, scientific justification for its exclusion [9]. data may be excluded only where it can be demonstrated through robust science that the data is anomalous or non-representative. this justification should be documented and considered during data review and reporting [8].\n\nwhere the capability of the electronic system permits dynamic storage, high resolution or dynamic (electronic) data should be collected in preference to low resolution or static (printed/manual) data [8]. data should not be stored electronically in temporary memory in a manner that allows for manipulation, before being stored safely and securely. data should be attributable to a specific individual, where relevant. the need for verification of data entry, e.g., via second person verification or through technical means such as data validation or barcoding, should be considered based on specific requirements, intended use, and potential impact of data errors within the process.\n\nrisks to data integrity will be influenced by the degree to which data can potentially be manipulated. individuals with a direct interest in the data should not be granted system level privileges (e.g., permitting data deletion, database modification). data should be stored in the predefined location and format. where the same information is recorded concurrently in more than one location or format, the process/data owner should define where the primary record is retained [1]. data should be secured from modification by unauthorized persons and should be changed or deleted only in accordance with regulatory expectations and requirements. any changes should be recorded in the audit trail.\n\n### data processing\n\nduring this phase, data is processed to obtain and present information in the required format. 
processing should occur in accordance with defined and verified processes (e.g., specified and tested calculations and algorithms), and approved procedures. record/report type documents are typically processed prior to review and reporting. process data should not be manipulated to achieve a more desirable end point. if data processing has been repeated, with iterative modification of processing parameters, this should be made apparent [8]. all data relating to iterative processing runs should be reviewed during routine data verification and stored for the duration of the regulatory retention period.\n\nif data is reprocessed, written procedures should be established and followed. each result should be retained for review, including all data relating to iterative processing runs. for most laboratory analyses, reprocessing data should not be needed on a regular basis. sampling, testing, or processing should not be performed with the goal of achieving a specific result or to overcome an unacceptable result (e.g., testing different samples until the desired passing result is obtained; this practice is sometimes referred to as testing into compliance) [9].\n\nthe impact of data processing on product quality and patient safety will vary by product and business process. the rigor of the controls and verification required for the data processing step should be determined by a documented and justified risk assessment. 
topics to consider include, where appropriate:\n\n- impact of data on product quality and decision making\n- requirements for independent second person verification by a qualified individual\n- opportunity for data modification or deletion\n- preventing original data from being overwritten or deleted", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "18f6e04d-1430-4ed0-a299-f71d18af1d30": {"__data__": {"id_": "18f6e04d-1430-4ed0-a299-f71d18af1d30", "embedding": null, "metadata": {"page_label": "38", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Ensuring Data Integrity and Review in GAMP\u00ae Guidelines: A Comprehensive Guide\"", "questions_this_excerpt_can_answer": "1. How does the ISPE GAMP\u00ae Guide recommend ensuring data integrity during the processing and presentation of data, specifically in terms of user interaction and data exclusion?\n \n2. What specific steps and considerations does the ISPE GAMP\u00ae Guide outline for the review and approval process of data, including the handling of audit trails and atypical data, to ensure compliance with regulatory requirements?\n\n3. 
According to the ISPE GAMP\u00ae Guide, what are the recommended practices for documenting and justifying excluded data during processing to maintain data integrity and transparency in pharmaceutical development and manufacturing environments?", "prev_section_summary": "The section discusses guidelines for ensuring data integrity in GxP decision-making processes, including the retention of all data, proper documentation of data exclusions, handling of electronic data storage to prevent manipulation, and considerations for data processing. Key topics include the importance of retaining all data, documenting exclusions with valid scientific justifications, storing electronic data securely, preventing unauthorized manipulation, and conducting risk assessments to determine controls and verification required during data processing. Entities mentioned include the ISPE GAMP(r) guide, electronic systems, data processing, data entry verification, data storage, and regulatory expectations for data integrity.", "excerpt_keywords": "ISPE, GAMP, data integrity, electronic systems, audit trails"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: records and data integrity\n\n- preventing data in displays and printouts from being obscured (e.g., by inappropriate annotation)\n- limiting processing to authorized personnel\n- securing processing parameters from unauthorized change\n- logging of parameter changes\n- reporting of data processing errors (either by the system or manually)\n- reconstruction of all data processing activities\n- ensuring traceability of user-defined parameters\n- ability of the user to influence what data is reported (e.g., the user can select what data to print)\n- ability of the user to determine the presentation of data (e.g., adjusting scale/resolution of graphical reports)\n- use of calculations as validity checks\n- use of 
calculations for the transformation of data\n- exclusion of data during processing should be justified and documented. excluded data should be retained with the original data set and be available for subsequent review [8].\n\n## data review reporting and use\n\nduring this phase data is used for informed decision making. data review, reporting, and use should be performed in accordance with defined and verified processes and approved procedures. data review and reporting are typically concerned with record/report type documents.\n\n### data review\n\ndata review (including second person review as required by regulation) should determine whether predefined specifications, targets, limits, or criteria have been met. the review should be based on a thorough process understanding (and where applicable system understanding) and impact on product quality and/or decision making, and outcomes and conclusion documented.\n\nthe process for the review and approval of data should be described in a procedure. original records subject to review should be defined. reviews should be based upon original data or a true copy (preserving content and meaning). data review should include a review of relevant metadata and gxp data audit trails, where appropriate [8]. audit trails are considered part of the history of associated records. personnel responsible for the review of regulated records should review the audit trails that capture changes to critical data associated with the record. audit trails that capture changes to critical data should be reviewed with each record and before final approval of the record [9].\n\nall data should be considered, recognizing that data may be stored in different locations. this includes atypical, suspect, rejected, invalid, or deleted data, along with any justifications. 
excluded data should be supported by a documented justification.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "d6a029c1-05fa-4dc5-95e3-57f6b947dad8": {"__data__": {"id_": "d6a029c1-05fa-4dc5-95e3-57f6b947dad8", "embedding": null, "metadata": {"page_label": "39", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Ensuring Data Integrity and Audit Trail Review in Regulated Companies: Best Practices and Guidelines\"", "questions_this_excerpt_can_answer": "1. What specific criteria should routine data reviews evaluate to ensure records and data integrity in regulated companies, according to the ISPE Records and Data Integrity Guide?\n \n2. How does the ISPE Records and Data Integrity Guide recommend handling errors or omissions identified during data reviews in the context of regulated companies to maintain compliance with established organizational and technical measures?\n\n3. What are the guidelines provided by the ISPE Records and Data Integrity Guide for conducting audit trail reviews in regulated companies, including the factors that should determine the frequency and rigor of these reviews?", "prev_section_summary": "The section discusses the ISPE GAMP\u00ae Guide recommendations for ensuring data integrity in processing and presentation of data, including preventing data obscuring, limiting processing to authorized personnel, securing processing parameters, and documenting parameter changes. 
It also covers the importance of data review, reporting, and use in decision making, emphasizing the need for thorough process understanding, review of metadata and audit trails, and justification for excluded data. The section highlights the significance of data integrity, transparency, and compliance with regulatory requirements in pharmaceutical development and manufacturing environments.", "excerpt_keywords": "data integrity, audit trail, regulated companies, ISPE Records, review procedures"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## records and data integrity\n\nroutine data review should evaluate [8]:\n\n- the integrity of an individual data set\n- compliance with established organizational and technical measures\n- any data risk indicators (e.g., data amendment, or orphan data)\n\ndata review procedures should cover:\n\n- method for review\n- method for approval, e.g., by use of an electronic signature\n- the meaning of review and approval signatures to ensure persons understand their responsibilities\n- requirements for review by quality assurance (e.g., under us gmps, any data created as part of a cgmp record must be evaluated by the quality unit as part of release criteria) [9]\n- handling errors or omissions identified during data review\n- managing data corrections or clarifications\n- managing atypical, erroneous, or invalid results\n\nsecond person reviews should focus on the overall process from data creation to calculation of reportable results. such reviews may cross system boundaries as well as the associated external records and may include verification of any calculations used.\n\nreview by exception should be based on validated data processing routines that cannot be influenced by the user. 
periodic review or audit of data governance measures should assess effectiveness of established organizational and technical measures, and should also consider the possibility of unauthorized activity [8]. where data reviews are conducted by a third party the regulated company should ensure respective roles and responsibilities are documented and agreed by both parties.\n\n## audit trail review\n\nregulated companies should establish a documented process for review of audit trails, including as a part of second person review. these reviews should form part of the routine data review/approval process and are usually performed by the operational area which has generated the data (e.g., clinical, laboratory, manufacturing). the requirement for audit trail review, including the frequency and rigor and roles and responsibilities, should be based on a documented risk assessment taking into account the business process and criticality of the data, the complexity of the system and its intended use, and the potential impact on product quality and patient safety. audit trail reviews should be performed by an individual who has an understanding of the business process and the impact of the actions recorded. they are an effective means of verifying that changes are made by authorized users and for detecting potential data integrity issues. 
for more information on review of audit trails, see appendix m4.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "905b1b3e-0b50-4c24-8d13-cbcc85d03d6d": {"__data__": {"id_": "905b1b3e-0b50-4c24-8d13-cbcc85d03d6d", "embedding": null, "metadata": {"page_label": "40", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Ensuring Data Integrity and Security in Reporting and Distribution Processes: Best Practices and Strategies\"", "questions_this_excerpt_can_answer": "1. What specific procedures should be implemented to ensure the consistency and integrity of data reporting within the framework of the ISPE GAMP(r) guide, particularly in relation to pharmaceutical engineering and data integrity practices?\n\n2. How does the ISPE GAMP(r) guide address the challenges of data manipulation and the importance of problem identification and resolution in the context of pharmaceutical data reporting and decision-making processes?\n\n3. What are the recommended practices for data distribution according to the ISPE GAMP(r) guide, including the management of version control and the verification of system interfaces, to support regulated activities in the pharmaceutical industry?", "prev_section_summary": "The section discusses the importance of ensuring records and data integrity in regulated companies, as outlined in the ISPE Records and Data Integrity Guide. 
Key topics include routine data reviews evaluating data integrity, compliance with organizational and technical measures, and handling errors or omissions. The section also covers guidelines for conducting audit trail reviews, including factors determining frequency and rigor, roles and responsibilities, and the importance of understanding the business process and impact of actions recorded. Overall, the section emphasizes the need for effective data review and audit trail processes to maintain data integrity and compliance in regulated companies.", "excerpt_keywords": "ISPE, GAMP, data reporting, data integrity, pharmaceutical engineering"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide:\n\n### 4.4.3 data reporting\n\ndata reporting procedures should ensure the consistency and integrity of results. the procedures should:\n- define what data is to be included in the data set used for reporting the results, i.e., the complete data set\n- address report layout and formatting requirements\n- describe use of reports for gxp decisions\n- address report, review, and approval requirements\n- ensure all data is included in the dataset, unless there is documented justification for excluding it\n- address manual data entry\n- address handling and investigation of atypical results\n- address suspect, rejected, invalid, or deleted data\n- describe how corrective and preventative actions are established for invalid runs, failures, repeats, and other atypical data\n- consider controls to prevent and detect data manipulation\n- encourage problem identification and solving\n\nparticular attention is required when users are able to influence the reporting of data, e.g., testing into compliance should be avoided. 
poorly designed or defined processes, test methods, and reports can create opportunities for data reporting errors, and the potential for erroneous decisions.\n\nsummary reports are limited as they may not contain all data and there is a risk that data issues may not be included, and therefore, not reviewed. where data summaries are used for reporting, there should be documented verification of these summaries in accordance with original data.\n\ntrending and appropriate metrics can be used to identify and investigate potential data integrity issues.\n\n### 4.4.4 data distribution\n\ndata should be accessed by, and distributed to, authorized individuals and other systems supporting the business process. interfaces between business process systems should be designed and verified to report failures, prevent data loss, and enable recovery.\n\nversion control should be applied to ensure that personnel have access to the appropriate and up to date versions required in order to perform regulated activities. instruction type documents should have clear effective dates. 
the need for formal confirmation of data receipt should be considered.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "301e6f30-7d78-4102-b750-cabe09bd3c62": {"__data__": {"id_": "301e6f30-7d78-4102-b750-cabe09bd3c62", "embedding": null, "metadata": {"page_label": "41", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Ensuring Secure Data Retention and Retrieval Compliance in Regulated Environments: Best Practices and Guidelines\"", "questions_this_excerpt_can_answer": "1. What specific considerations should regulated companies take into account when retaining and retrieving data to ensure compliance with GxP content and meaning, according to the ISPE Records and Data Integrity Guide?\n \n2. How does the ISPE Records and Data Integrity Guide recommend handling the verification and disposal of original records when they are converted to electronic format for retention purposes in regulated environments?\n\n3. What are the guidelines provided by the ISPE Records and Data Integrity Guide for ensuring the security and integrity of data and associated metadata during changes to system hardware or software in a regulated company's data retention and retrieval processes?", "prev_section_summary": "The section discusses the importance of data reporting and distribution procedures in ensuring data integrity and security within the framework of the ISPE GAMP(r) guide. 
Key topics include defining data sets for reporting, addressing report layout and formatting requirements, handling atypical results, preventing data manipulation, and encouraging problem identification and resolution. The section also emphasizes the need for authorized access to data, verification of system interfaces, version control, and clear effective dates for instruction documents to support regulated activities in the pharmaceutical industry.", "excerpt_keywords": "ISPE, Records, Data Integrity, Compliance, GxP"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## records and data integrity\n\n### data retention and retrieval\n\nduring this phase, data should be retained securely. data should be readily available through the defined retention period in accordance with defined and verified processes and approved procedures.\n\napplicable regulatory requirements, other laws and legislation, and internal regulated company policies should be met. this includes retention and privacy requirements. regulated companies should be aware that legislation such as local laws may have specific requirements to be met (e.g., blood products have different retention periods in the us, europe, and japan). legal admissibility of information stored electronically should also be considered.\n\ndata, including all the original data and associated metadata required to maintain gxp content and meaning, should be defined and stored in a secure location that has adequate physical and electronic protection against deliberate or inadvertent alteration or loss throughout the retention period. access should be limited to authorized persons and adequate environmental protection from damage should be provided (e.g., from water and fire).\n\ndata, and relevant associated metadata, should be readily traceable and accessible throughout the retention period. 
the relationships between data and associated metadata should be preserved securely to support future queries or investigations, including reconstruction of gxp activities. data and associated metadata may reside in separate locations and possibly on different media.\n\nretrieval of records and copies of records made (including those made for archival purposes) should preserve the content and meaning of the record, and continue to meet relevant gxp regulatory requirements.\n\npaper records may be retained electronically, e.g., by scanning, provided the copy is verified as a true copy, following an established validated process. regulated companies may discard the original record once a verified true copy (preserving content and meaning) has been made, e.g., where the original record is not permanent.\n\nfollowing changes to the system (hardware or software), the regulated company should verify that the retained data can still be accurately retrieved.\n\ndata should be amended only by authorized persons in accordance with an approved process. a record of all changes should be retained indicating when and how the original data was changed. 
there should be controls in place to ensure that previous versions of data are not inadvertently restored or otherwise made available.\n\ndata retention and retrieval procedures should consider the following, where relevant:\n\n- address data synchronization where the system architecture involves storing data on multiple servers\n- disaster recovery, including backup and restore\n- durability of media used for storage\n- use of encryption to enhance security (dependent on criticality of data and risks to that data)\n- environmental controls\n\nfor further details on operational processes and controls see ispe gamp(r) good practice guide: a risk-based approach to operation of gxp computerized systems [18].\n\nwhere a third party is involved in regulated data retention and retrieval activities, the regulated company should ensure that a documented assessment of the third party and the risks is undertaken. this should demonstrate that adequate controls are in place and residual risks are understood and accepted.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "517f5f8b-5593-42ea-bc0a-d569113cebf9": {"__data__": {"id_": "517f5f8b-5593-42ea-bc0a-d569113cebf9", "embedding": null, "metadata": {"page_label": "42", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Best Practices for Data Retention and Migration in Regulated Companies", "questions_this_excerpt_can_answer": "1. 
What specific considerations should regulated companies take into account when deciding how to retain and retrieve regulated data from systems that are being retired or can no longer be supported?\n \n2. How does the ISPE GAMP(r) guide suggest regulated companies handle the migration of metadata, including the audit trail, during data migration processes?\n\n3. According to the ISPE GAMP(r) guide, under what conditions can original records be deleted in regulated companies, and what procedural steps must be followed to ensure compliance with GXP regulations?", "prev_section_summary": "The section discusses the importance of data retention and retrieval in regulated environments, emphasizing the secure storage and accessibility of data throughout the defined retention period. It highlights the need to meet regulatory requirements, laws, and internal company policies, as well as considerations for legal admissibility of electronically stored information. The guidelines cover the secure storage of data and associated metadata, traceability, preservation of content and meaning during retrieval, verification of electronic records, and controls for data amendments. Additionally, it addresses data synchronization, disaster recovery, media durability, encryption, and environmental controls. 
The section also mentions the involvement of third parties in data retention and retrieval activities, emphasizing the need for documented assessments and adequate controls.", "excerpt_keywords": "ISPE, GAMP, data retention, migration, regulated companies"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide:\n\na formal agreement should be established defining the respective roles and responsibilities and covering:\n\n- procedures to follow\n- physical location\n- applicable regulations and laws and their impact\n- monitoring\n- incident management\n- business continuity\n- backup\n- data destruction\n\nthe same retention and retrieval requirements apply irrespective of whether the data is electronic or paper based.\n\nwhen a system is retired, or can no longer be supported, regulated companies should consider how regulated data will continue to be retained and retrieved throughout the remainder of the retention period, considering:\n\n- criticality of data\n- practicality of maintaining existing software (e.g., in a virtual environment)\n- options for migrating the data to another system or to an alternative file format, while retaining the original content and meaning\n- risk of retaining the information without metadata, e.g., retaining only reports written to pdf or paper\n\nregulated companies may choose to retain records in formats other than the original, provided content and meaning is preserved, and gxp regulations are met. the ability to retain records in a format that ensures the dynamic nature of the data throughout the retention period is not always possible or cost effective, due to the difficulty of migrating the data over time. decisions should consider the balance between the need for long term accessibility and the level of ongoing retrieval functionality required (e.g., need for dynamic searching, trending, reprocessing). 
original records can only be deleted if gxp regulations are fully satisfied and the content and meaning of the records are preserved, following a defined procedure including management authorization.\n\nmigration should be based on a defined process, including a documented risk assessment, and managed within the framework of a data migration plan and report. see ispe gamp(r) 5 [3].\n\nwhen migrating data, regulated companies should make an informed risk-based decision regarding the migration of metadata, including the audit trail along with the data. this decision should be based upon business requirements and regulatory expectations. if the audit trail is integral to understanding the data, it should be maintained as part of the migrated data. a decision not to migrate an audit trail should be justified based on risk, and documented.\n\nwhere legacy systems are retained for data retention and retrieval, the regulated company should periodically verify that data can still be retrieved.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "19e5d56b-dd21-4abb-8bf2-f67e8778b87b": {"__data__": {"id_": "19e5d56b-dd21-4abb-8bf2-f67e8778b87b", "embedding": null, "metadata": {"page_label": "43", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Best Practices for Ensuring Data Integrity: Backup, Restore, and Archiving Processes\"", "questions_this_excerpt_can_answer": "1. 
What specific measures should a regulated company take to ensure the integrity of backup processes for GxP relevant data, including associated metadata, to prevent data loss or corruption?\n \n2. How should a regulated company approach the archiving of data and associated metadata to ensure long-term, permanent retention for review or investigation purposes, including the ability to reconstruct a process or activity?\n\n3. What are the recommended practices for verifying the process and technology for data restoration, particularly in terms of accessibility, readability, and accuracy, as outlined in the ISPE Records and Data Integrity Guide?", "prev_section_summary": "The section discusses best practices for data retention and migration in regulated companies, as outlined in the ISPE GAMP(r) guide. Key topics include establishing formal agreements for data retention and retrieval, considerations for retaining and migrating regulated data from retired systems, handling metadata and audit trails during data migration processes, and the conditions under which original records can be deleted in compliance with GXP regulations. The importance of preserving the content and meaning of records, making informed risk-based decisions during data migration, and periodically verifying data retrieval from legacy systems are also highlighted.", "excerpt_keywords": "Records, Data Integrity, Backup, Restore, Archiving, Metadata"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## records and data integrity\n\n### backup and restore\n\nregular backups of all relevant data, including associated metadata required to maintain gxp content and meaning, should be performed in accordance with a documented process in order to allow recovery in the case of system failure, data corruption, or loss. 
the regulated company should ensure that the backup process is designed such that regulated data is not lost or corrupted.\n\ndata backup should have controls commensurate with those for the original data to prevent unauthorized access, modification, or deletion. backups should be held in a physically separate and secure location. the process and technology for restoring data should be based on the criticality of the data and the required restoration time. the process for backup and the ability to restore data should be verified before use and monitored periodically for accessibility, readability, and accuracy throughout the retention period.\n\nwhere relevant metadata is stored separately from the regulated data (e.g., in audit trails or separate files), it should be ensured that all required data is available and can be successfully restored.\n\n### archiving\n\narchiving involves the long term, permanent retention of data and associated metadata for the purposes of review or investigation throughout the retention period, which may include the reconstruction of a process or activity [1]. archiving should be performed in accordance with defined and verified processes and approved procedures that meet the general requirements for retention and retrieval. see section 4.5.1.\n\nthese procedures should consider where appropriate:\n\n- determination of storage media life expectancy\n- multiple copies\n- management of stored media\n- indexing of stored records\n- impact of system upgrade on stored records\n- impact of changing technology on stored records\n- ability to reprocess data where required\n\narchived data should be periodically checked for accessibility and readability. 
for more information on archiving of records, see appendix o1.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "59d8196f-63ef-4eb5-950d-9fc453590c4c": {"__data__": {"id_": "59d8196f-63ef-4eb5-950d-9fc453590c4c", "embedding": null, "metadata": {"page_label": "44", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Data Destruction and Retention Compliance in ISPE GAMP(r) Guide: Best Practices and Guidelines", "questions_this_excerpt_can_answer": "1. What specific considerations should be taken into account when establishing a procedure for the destruction of data according to the ISPE GAMP(r) Guide's recommendations on records and data integrity?\n\n2. How does the ISPE GAMP(r) Guide suggest handling the conflict between different jurisdictional laws regarding data retention and the disposal of data in the context of records and data integrity?\n\n3. What measures does the ISPE GAMP(r) Guide recommend to ensure that data is not inadvertently disposed of before the end of its required retention period or while it is under a litigation hold?", "prev_section_summary": "This section discusses the importance of ensuring data integrity through backup, restore, and archiving processes in regulated companies. Key topics include the need for regular backups of data and associated metadata, controls to prevent unauthorized access or modification, secure storage of backups, verification of data restoration processes, and long-term archiving of data for review or investigation purposes. 
Entities mentioned include regulated companies, data backup processes, technology for data restoration, metadata storage, archiving procedures, storage media life expectancy, management of stored media, and periodic checks for accessibility and readability of archived data.", "excerpt_keywords": "ISPE, GAMP, data integrity, data destruction, retention compliance"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: records and data integrity\n\n4.6 data destruction\n\nthe data destruction phase involves ensuring that the correct original data is disposed of after the required retention period in accordance with a defined process and approved procedures. the procedure should consider:\n\n- retention requirements\n- method of disposal, ensuring deletion from all systems and physical locations\n- conditions of disposal\n- measures to prevent inadvertent disposal of data that is still required\n- documentation required to demonstrate timely disposal (for business reasons)\n- limiting disposal functionality to a restricted number of responsible individuals\n\ndata disposal should account for all local legislative retention requirements and for all locations that reference the data. retention requirements may differ by jurisdiction, or there may be a litigation hold on some data in some countries/regions. there may also be conflict between applicable laws.\n\ndata disposal should not occur without verification that the regulated record is not in a hold status to support litigation. distributed copies should also be destroyed when a record is disposed. 
data should not normally be retained beyond the defined retention period.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "40caba99-fe3c-4fc7-9ac6-673cf1db30d6": {"__data__": {"id_": "40caba99-fe3c-4fc7-9ac6-673cf1db30d6", "embedding": null, "metadata": {"page_label": "45", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Comprehensive Title: Quality Risk Management for Records and Data Integrity in [Specific Industry/Field]", "questions_this_excerpt_can_answer": "1. What is the role of quality risk management in the lifecycle of computerized systems and data, according to the ISPE Records and Data Integrity Guide?\n\n2. How does the ISPE Records and Data Integrity Guide suggest conducting a process risk assessment for identifying high-level risks to patient safety, product quality, and data integrity?\n\n3. Can you detail the five-step approach for quality risk management as described by ISPE GAMP\u00ae 5 and its relation to ICH Q9 within the context of managing risks associated with records and data integrity?", "prev_section_summary": "The section discusses data destruction in accordance with the ISPE GAMP(r) Guide's recommendations on records and data integrity. Key topics include retention requirements, method of disposal, conditions of disposal, measures to prevent inadvertent disposal, documentation requirements, and limiting disposal functionality to responsible individuals. 
The section also addresses the handling of conflicting jurisdictional laws regarding data retention and disposal, as well as measures to ensure data is not disposed of prematurely or while under litigation hold. Key entities mentioned include the ISPE GAMP(r) Guide, responsible individuals, and regulated records.", "excerpt_keywords": "ISPE, Records, Data Integrity, Quality Risk Management, GAMP"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## records and data integrity\n\n5 quality risk management\n\n5.1 introduction\n\nquality risk management is a systematic process for the assessment, control, communication, and review of risks. it is an iterative process used throughout the entire computerized system life cycle from concept to retirement, and throughout the data life cycle from creation to destruction. risks to data and record integrity should be identified and managed along with other quality and safety risks by adopting a risk management approach based on an understanding of the process. the evaluation of the risk to quality should be based on scientific knowledge and ultimately linked to the protection of the patient. the level of effort, formality, and documentation of the quality risk management process should be commensurate with the level of risk. some records and data may reside on more than one system during their life cycle, and quality risk management activities should start at the business process level, at a level higher than individual systems.\n\n5.2 process risk assessment\n\na process risk assessment (also known as business process risk assessment) is a non-system specific high level assessment at the business process or data flow, which may occur before system-specific quality risk management activities. 
an equivalent risk assessment from a data flow (rather than business process flow) perspective may be performed, using the same approaches and techniques, and with the same benefits. the process risk assessment is aimed at identifying key high level risks to patient safety, product quality and data integrity, and identifying the required controls to manage those risks. typically, at this stage no assumptions are made about the nature or exact functionality and design of the computerized system(s) that will support the process. the process risk assessment provides valuable input to subsequent quality risk management activities. typical inputs to the process risk assessment include:\n\n- defined business process scope\n- process descriptions and/or diagrams\n- identified regulatory requirements for the proposed process scope\n- identified company quality requirements\n\n5.3 quality risk management approach\n\nthe activities required to manage the specific risks associated with records and data should form part of the normal individual gxp computerized system life cycle. decisions on the extent of validation and data integrity controls should be based on a justified and documented risk assessment [7]. 
ispe gamp(r) 5 [3] describes a five-step approach for quality risk management, based on ich q9 [10].", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "f720f921-b093-480d-a2c4-584c3c83be61": {"__data__": {"id_": "f720f921-b093-480d-a2c4-584c3c83be61", "embedding": null, "metadata": {"page_label": "46", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Comprehensive Quality Risk Management Approach for Ensuring Patient Safety, Product Quality, and Data Integrity", "questions_this_excerpt_can_answer": "1. What is the initial step in the quality risk management approach outlined in the ISPE Records and Data Integrity Guide, and what does it entail in terms of system impact assessment?\n \n2. How does the guide recommend identifying functions that impact patient safety, product quality, and data integrity during the development and review of user requirements and system specifications?\n\n3. According to the guide, how should risks associated with standalone records, not linked to any specific computerized system, be managed within the context of quality risk management and data integrity?", "prev_section_summary": "The section discusses the importance of quality risk management in ensuring records and data integrity in computerized systems. It emphasizes the systematic process of assessing, controlling, communicating, and reviewing risks throughout the system and data lifecycle. 
The section outlines the process risk assessment, which identifies high-level risks to patient safety, product quality, and data integrity at the business process level. It also details the five-step approach for quality risk management as described by ISPE GAMP\u00ae 5 and its relation to ICH Q9 in managing risks associated with records and data integrity. The key entities mentioned include the evaluation of risk to quality based on scientific knowledge, the importance of starting quality risk management activities at the business process level, and the need for justified and documented risk assessments to determine validation and data integrity controls.", "excerpt_keywords": "Quality Risk Management, Data Integrity, Patient Safety, Product Quality, ISPE Guide"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## figure 5.1: quality risk management approach\n\n|step 1|perform initial risk assessment and determine system impact|\n|---|---|\n|step 2|identify functions with impact on patient safety, product quality, and data integrity|\n|step 3|perform functional risk assessments and identify controls|\n|step 4|implement and verify appropriate controls|\n|step 5|review risks and monitor controls|\n\nsome records may be stand alone, maintained on a file share, and may not be associated with any one specific computerized system. the risks associated with such records should be considered and addressed, typically as part of process (or data flow) risk assessment.\n\nstep 1 - perform initial risk assessment and determine system impact\n\nan initial risk assessment should be performed based on an understanding of business processes and process risk assessments, user requirements, regulatory requirements, and known functional areas. 
this should include initial identification of important data, records, and signatures.\n\nstep 2 - identify functions with impact on patient safety, product quality, and data integrity\n\nfunctions which have an impact on patient safety, product quality, and data integrity should be identified by building on information gathered previously, referring to relevant specifications, and taking into account project approach, system architecture, and categorization of system components.\n\nbased on defined business processes, during development of user requirements and during subsequent functional, configuration, or design specifications, the associated data, records, and signatures should be defined and documented.\n\nthis activity typically begins in step 1, and continues iteratively through the data life cycle. a data flow analysis is useful in supporting this activity and in determining the role of each item in regulated processes. the primary record should be identified. the emphasis should be on identifying data and records at the regulated process level, rather than physical database records, or table fields.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "f3a6acf7-e14c-4729-bb46-c6e69c152f01": {"__data__": {"id_": "f3a6acf7-e14c-4729-bb46-c6e69c152f01", "embedding": null, "metadata": {"page_label": "47", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Comprehensive Title: Data Integrity Risk Management in Regulated Environments: Strategies and Best Practices", 
"questions_this_excerpt_can_answer": "1. What specific steps are recommended by the ISPE GAMP\u00ae Guide for performing functional risk assessments to ensure records and data integrity in regulated environments?\n\n2. How does the ISPE Records and Data Integrity Guide suggest addressing potential hazards to data, records, and signature integrity during the risk assessment process?\n\n3. What are the key considerations outlined in the document for reviewing risks and monitoring controls related to data integrity, specifically in the context of regulated companies' mechanisms and programs?", "prev_section_summary": "The section discusses the quality risk management approach outlined in the ISPE Records and Data Integrity Guide, focusing on steps such as initial risk assessment, identifying functions impacting patient safety and product quality, implementing controls, and monitoring risks. It also addresses the management of risks associated with standalone records not linked to specific computerized systems. Key topics include performing risk assessments, identifying critical functions, defining data and records, and emphasizing regulated process level data over physical database records. Key entities mentioned are initial risk assessment, system impact, functions impacting patient safety and product quality, data integrity, user requirements, regulatory requirements, and data flow analysis.", "excerpt_keywords": "ISPE, GAMP, Records, Data Integrity, Risk Management"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: page 45\n\nrecords and data integrity\n\nstep 3 - perform functional risk assessments and identify controls\n\nfunctions identified during step 2 should be assessed by considering possible hazards, and how the potential harm arising from these hazards may be controlled. 
it may be necessary to perform a more detailed assessment that analyzes further the severity of harm, likelihood of occurrence, and probability of detection (see ispe gamp(r) 5 [3] for an example detailed assessment process).\n\npotential hazards to data, records, and signature integrity should be specifically addressed as an integral part of these assessments. required data integrity controls should be identified and documented. it is desirable to eliminate risk by modifying processes or system design, where possible. if risk cannot be eliminated, it should be reduced to an acceptable level by appropriate controls.\n\ncontrols may be behavioral, procedural, or technical in nature. technical controls should be included in the relevant specifications (e.g., user requirements specifications or functional specifications), and identified procedures should be developed for the system. verification of the installation and correct operation of technical controls (and some procedural controls) should occur during testing. behavioral controls are general, i.e., not specific to a single system, and should be part of a wider data governance framework. see appendix d3.\n\na suitable cross-functional team should be involved in risk assessments, including the process owner, data owners, and other functions as necessary including qu, it, and engineering.\n\nstep 4 - implement and verify appropriate controls\n\nthe control measures identified in step 3 should be implemented and verified to ensure that they have been successfully implemented. controls should be traceable to the relevant identified risks. the verification activity should demonstrate that the controls are effective in performing the required risk reduction.\n\nstep 5 - review risks and monitor controls\n\nregulated companies may have different mechanisms and programs in place to review and assess the effectiveness of data integrity controls. 
these include periodic system reviews, user access reviews, it security audits, data audits, and qa audits.\n\nperiodic reviews may address only a subset of data integrity controls, as they are typically focused on validation and system-based aspects and not organizational, data, and process level aspects.\n\n### product and process context\n\neffective quality risk management depends on knowledge of the regulated product and process. judgment should be driven by an overall risk assessment of the business or facility aimed at identifying the overall risks to product quality or public safety that may occur due to data integrity problems. different facilities and products will have different risk profiles.\n\nproduct and process risk considerations include:\n\n- misinterpretation of product quality, safety, or efficacy\n- adulteration of product\n- release of adulterated or quarantined product\n- misbranding of product", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "bd12b854-bba2-4ad5-bf1a-b3253b168096": {"__data__": {"id_": "bd12b854-bba2-4ad5-bf1a-b3253b168096", "embedding": null, "metadata": {"page_label": "48", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Mitigating Risks and Vulnerabilities in Data Integrity Controls: Strategies for Ensuring Data Security and Accuracy", "questions_this_excerpt_can_answer": "1. 
What are some specific examples of hazards that can impact data integrity in the pharmaceutical industry, as outlined in the ISPE GAMP\u00ae Guide: Records and Data Integrity?\n\n2. How does the ISPE GAMP\u00ae Guide suggest pharmaceutical companies should prioritize their efforts in mitigating risks to data integrity?\n\n3. According to the ISPE GAMP\u00ae Guide, what factors should be considered when determining the appropriate level of control for mitigating risks and vulnerabilities in data integrity controls within the pharmaceutical sector?", "prev_section_summary": "The section discusses the importance of performing functional risk assessments to ensure records and data integrity in regulated environments, as recommended by the ISPE GAMP\u00ae Guide. It highlights the need to identify and address potential hazards to data, records, and signature integrity during the risk assessment process. The document outlines key considerations for reviewing risks and monitoring controls related to data integrity, emphasizing the involvement of a cross-functional team in risk assessments and the implementation and verification of appropriate controls. It also mentions the importance of periodic reviews and various mechanisms and programs that regulated companies may have in place to assess the effectiveness of data integrity controls. 
Additionally, the section emphasizes the importance of understanding the product and process context in quality risk management, including considerations such as misinterpretation of product quality, safety, or efficacy, adulteration of product, and misbranding of product.", "excerpt_keywords": "ISPE, GAMP, data integrity, pharmaceutical industry, risk assessment"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: records and data integrity\n\n* inability to recall product\n\n* incorrect product quality or patient safety decisions, e.g., impact on preclinical or clinical safety results, adverse drug reactions (adr) and adverse events (ae).\n\n* incorrect submission to a regulatory agency\n\nrisks of such outcomes, the existence of other mitigating controls, as well as the vulnerability of the data to loss or corruption, and the actual physical and technical environment, should be considered when identifying the appropriate and commensurate level of control.\n\nhazards and vulnerabilities should be identified and documented as part of the risk assessment process. 
examples of hazards potentially impacting data integrity include:\n\n* data falsification due to storing data electronically in temporary memory in a manner that allows for manipulation\n\n* loss or corruption of data due to system interface errors\n\n* manual transcription or data entry errors\n\n* unauthorized approvals due to uncontrolled access\n\n* data corruption due to information security failures (e.g., firewall breaches or malware attacks)\n\n* processing failures due to software or configuration error\n\n* loss of data availability due to environmental problems or hardware failures\n\neffort should be focused on hazards that are specific to the process, type of data, use of data, or type of system, as this will help identify required controls beyond the basic controls, aimed at routine factors, that are typically already in place.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "e037f0bc-96f7-472a-97db-c9b41f14e7b6": {"__data__": {"id_": "e037f0bc-96f7-472a-97db-c9b41f14e7b6", "embedding": null, "metadata": {"page_label": "49", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Corporate Data Integrity Program Implementation and Requirements Guide", "questions_this_excerpt_can_answer": "1. What are the key components that should be included in the development of a high-level strategy for a corporate data integrity program according to the ISPE Records and Data Integrity Guide?\n\n2. 
How does the ISPE Records and Data Integrity Guide suggest measuring the performance and progress of a corporate data integrity program?\n\n3. What does the MHRA GMP Data Integrity Definition and Guidance for Industry (March 2015) state about the relationship between data integrity and the pharmaceutical quality system, as referenced in the ISPE Records and Data Integrity Guide?", "prev_section_summary": "The section discusses the risks and vulnerabilities in data integrity within the pharmaceutical industry as outlined in the ISPE GAMP\u00ae Guide. It highlights specific hazards that can impact data integrity, such as data falsification, data loss, manual errors, unauthorized access, and system failures. The guide suggests prioritizing efforts in mitigating risks based on the specific process, type of data, use of data, or type of system. Factors to consider when determining the appropriate level of control include the risks of outcomes, mitigating controls, vulnerability of data, and the physical and technical environment. The importance of identifying and documenting hazards and vulnerabilities as part of the risk assessment process is emphasized.", "excerpt_keywords": "ISPE, Records, Data Integrity, Corporate, Program"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: page 47\n\n## records and data integrity appendix m1\n\n### appendix m1 - corporate data integrity program\n\n6.1 introduction\n\nregulated companies should consider implementing a corporate data integrity program to identify, remediate, and manage potential risks to data integrity. this appendix is intended to provide regulated companies with a direction for creating a successful corporate data integrity program [19].\n\nregulated companies should ensure that they appropriately address record and data integrity and data governance. 
organizational, procedural, and technical controls should also be considered as part of data governance. the effort and resource assigned to data governance should be commensurate with the risk to product quality, and should also be balanced with other quality assurance resource demands [1].\n\nkey implementation considerations for a corporate data integrity program include development of a high-level strategy which:\n\n- includes a documented rationale\n- defines the executive sponsorship and governance process\n- focuses on management accountability\n- defines and implements tools for knowledge sharing\n- develops and provides the appropriate levels of training\n\nthe corporate data integrity program should address behavioral factors and drive a strategy that focuses on prevention, detection, and response. business processes, systems, equipment, and personnel continue to evolve and change. corporate data integrity programs should include a plan for continuous improvement, which includes:\n\n- appropriate metrics to measure performance\n- program reporting to communicate progress\n- appropriate audit and assessment processes to identify issues and measure progress and ongoing compliance\n\n6.2 is a corporate data integrity program required?\n\nthe mhra gmp data integrity definition and guidance for industry (march 2015) [1] discusses the need and importance of focusing on data integrity as part of a corporate program. 
it states that:\n\n\"data integrity is fundamental in a pharmaceutical quality system which ensures that medicines are of the required quality.\"\n\nthe mhra[1] goes on to state that:\n\n\"the data governance system should be integral to the pharmaceutical quality system ...\"", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "fd9d8f11-c7af-4d16-a930-e1bb66ac67bc": {"__data__": {"id_": "fd9d8f11-c7af-4d16-a930-e1bb66ac67bc", "embedding": null, "metadata": {"page_label": "50", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Data Integrity and Governance Compliance in Regulated Companies: Best Practices and Strategies\"", "questions_this_excerpt_can_answer": "1. How does the MHRA guidance for industry suggest regulated companies should balance their efforts and resources between data governance and other quality assurance demands?\n \n2. What are the key considerations regulated companies should evaluate to determine if their Quality Management System (QMS) adequately addresses data integrity requirements according to the ISPE GAMP\u00ae Guide: Appendix M1?\n\n3. 
What role does management accountability play in ensuring data integrity within a regulated company, and what specific actions should management take to foster an environment that promotes good data integrity practices as outlined in the \"Data Integrity and Governance Compliance in Regulated Companies: Best Practices and Strategies\" document?", "prev_section_summary": "The section discusses the importance of implementing a corporate data integrity program in regulated companies to identify, remediate, and manage potential risks to data integrity. Key topics include the development of a high-level strategy for the program, executive sponsorship, governance processes, management accountability, knowledge sharing tools, training, behavioral factors, prevention, detection, response strategies, continuous improvement, metrics for measuring performance, program reporting, audit processes, and ongoing compliance. The section also references the MHRA GMP Data Integrity Definition and Guidance for Industry, emphasizing the integral relationship between data integrity and the pharmaceutical quality system. Key entities mentioned include regulated companies, the MHRA, and the pharmaceutical quality system.", "excerpt_keywords": "ISPE, GAMP, data integrity, governance, regulated companies"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: appendix m1 records and data integrity\n\nthese two statements reinforce the expectation that regulated companies should address data integrity and data governance in their qms, because it is fundamental to ensuring product quality. 
the mhra guidance for industry further states that:\n\n\"the effort and resource assigned to data governance should be commensurate with the risk to product quality, and should also be balanced with other quality assurance resource demands.\"\n\nthe emphasis is on designing and implementing a quality and data governance program that provides an acceptable state of control based on the risk to data integrity. assessment activities can serve as a good basis for defining and establishing a corporate data integrity program strategy, as discussed in section 6.4.\n\n### 6.3 indicators of program scope and effort\n\nin order to design and implement an appropriate corporate data integrity program, regulated companies should first understand their current state and acceptability of control based on risk to data integrity.\n\ndata integrity and data governance should be an integral part of the qms. focusing on the organizational/procedural controls is an appropriate place to start.\n\nregulated companies should understand whether data integrity requirements are adequately addressed within the qms.\n\nperforming a review of the qms versus data integrity requirements can identify where procedural controls may need to be addressed. considerations include:\n\n- do adequate processes exist within the qms to prevent, detect, report, and address data integrity failures?\n- are the alcoa+ requirements clearly addressed within the qms?\n- are there adequately defined processes for generating and reviewing data?\n- are there adequate controls for the entire lifecycle of data?\n\nin a well-defined corporate qms aligned with gxp regulations, most of these items should be addressed and traceable to the regulations applicable to the business processes; however, a more detailed gap assessment may be required to understand the state of data integrity controls in place at the local level. 
organizational gaps are more likely to be identified as sites and local business areas define and execute their local procedures.\n\nthe corporate and quality culture can impact the level of data integrity within a regulated company and should also be assessed and understood, e.g.:\n\n- is there appropriate knowledge and accountability for data integrity requirements and expectations at the operational level, as these are the personnel who typically generate and manage the data used to support product quality?\n\nmanagement accountability, at all levels of the corporation, should play a key role in ensuring data integrity.\n\nmanagement should set an example and foster an environment that promotes and ensures good data integrity practices, including:\n\n- technical controls, which include equipment and computer systems, should be assessed to establish whether systems are adequately qualified and/or validated to ensure data integrity", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "87c458f9-096e-4a4f-83ee-3b95190fd058": {"__data__": {"id_": "87c458f9-096e-4a4f-83ee-3b95190fd058", "embedding": null, "metadata": {"page_label": "51", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Ensuring Data Integrity: Controls, Self-Assessment, and Regulatory Inspection in Data Management", "questions_this_excerpt_can_answer": "1. 
What specific measures does the ISPE Records and Data Integrity Guide recommend for ensuring the integrity of data throughout its life cycle, particularly in terms of system access, security, and audit trails?\n \n2. How does the guide propose organizations should conduct self-assessments and audits to monitor compliance with QMS and regulatory requirements, and what outcomes should these reviews aim to achieve in terms of identifying data integrity issues?\n\n3. According to the guide, how can regulatory inspection findings be used to assess the control of risk to data integrity, and what implications do these findings have for identifying and addressing systemic issues within a regulated company?", "prev_section_summary": "The section discusses the importance of data integrity and data governance in regulated companies, as outlined in the ISPE GAMP\u00ae Guide: Appendix M1. It emphasizes the need for a quality and data governance program that provides an acceptable state of control based on the risk to data integrity. The section also highlights indicators of program scope and effort, such as understanding the current state of data integrity controls within the Quality Management System (QMS) and assessing the corporate and quality culture. 
Management accountability is identified as crucial in ensuring data integrity, with specific actions recommended to promote good data integrity practices.", "excerpt_keywords": "Data Integrity, ISPE, Records, Self-Assessment, Regulatory Inspection"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## records and data integrity\n\n- system access and security should be defined and audit trails should be utilized to review, detect, report, and address data integrity issues\n- appropriate data life cycle management processes should ensure the integrity of the data throughout its required retention period\n- a combination of technical and procedural controls should ensure segregation of duties to eliminate role conflicts that can raise concerns about data integrity, including:\n  - administrator access\n  - control and/or elimination of shared accounts\n  - defined user roles with privileges assigned based on the user's roles and responsibilities\n\norganizational and technical controls should be implemented within the context of the product and business process. understanding how these data integrity and qms procedures and controls are executed and applied is considered a key indicator of the acceptability of the controls based on the risk to data integrity. for further information on management roles and responsibilities, see section 3.3.2.\n\n### self-assessment\n\na key qms requirement is to have an auditing or self-assessment process to monitor adherence and compliance with the qms and the regulatory requirements of the business. a review of the self-assessment, internal audit, and third party audit reports and observations associated with these activities should provide a measure of the effectiveness of the data integrity controls. 
reviews can help to:\n\n- identify data integrity issues\n- understand whether they are isolated, repeated, or part of a trend\n- understand whether there are any systemic corporate or quality culture issues\n\nthe self-assessment and audit processes should be designed to identify and address risks to data integrity and gaps in a timely manner. self-assessment and audit processes should also be part of monitoring the overall success and effectiveness of the corporate data integrity program.\n\n### regulatory inspection\n\nregulatory inspection findings can provide a measurement of the level of control of the risk to data integrity, especially if they were conducted by a regulatory agency which has implemented forensic data integrity inspection techniques. data integrity related observations issued for a given site are potential indicators of systemic issues that might exist at other sites within the regulated company. if it is found that similar issues exist at other sites within the regulated company, not addressing those issues in a timely manner could demonstrate a systemic issue and a potential corporate and quality culture issue.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "36628614-89b0-42d6-86bc-0485980e0306": {"__data__": {"id_": "36628614-89b0-42d6-86bc-0485980e0306", "embedding": null, "metadata": {"page_label": "52", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Implementation and Governance of Corporate Data Integrity Program: A Comprehensive Guide", 
"questions_this_excerpt_can_answer": "1. What specific guidance does the MHRA GMP Data Integrity Definition and Guidance for Industry (March 2015) provide regarding the allocation of effort and resources in the implementation of a corporate data integrity program?\n\n2. How does the ISPE GAMP\u00ae Guide: Appendix M1 suggest a corporate data integrity program's strategy should be utilized by senior management and during audits and inspections?\n\n3. According to the ISPE Records and Data Integrity Guide, what are the key factors to consider when determining the level of effort and resources required for implementing and updating the Quality Management System (QMS) and technical controls for a corporate data integrity program?", "prev_section_summary": "The section discusses the importance of records and data integrity, emphasizing the need for system access and security controls, audit trails, and data life cycle management processes. It also highlights the significance of segregation of duties, self-assessment processes, and regulatory inspections in ensuring data integrity. Key topics include system access, security, audit trails, self-assessment, regulatory inspections, and the importance of identifying and addressing systemic issues within a regulated company. Key entities mentioned are the ISPE Records and Data Integrity Guide, organizational and technical controls, self-assessment processes, and regulatory agencies conducting inspections.", "excerpt_keywords": "ISPE, GAMP, data integrity, corporate governance, regulatory inspections"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: appendix m1 records and data integrity\n\n### 6.3.3 effort and resources\n\nthe level of effort and resources required should be considered when implementing a corporate data integrity program. 
the mhra gmp data integrity definition and guidance for industry (march 2015) [1] states that:\n\n\"the degree of effort and resource applied to the organisational and technical control of data lifecycle elements should be commensurate with its criticality in terms of impact to product quality attributes.\"\n\nthese decisions depend on several factors:\n\n1. the first factor is the outcome of gap assessments and audits of the organizational controls within the qms. significant gaps can require a greater effort to update the qms with the appropriate controls to address those integrity risks. these updates may result in the creation of site and/or local procedures to functionally implement the controls and processes.\n2. the second factor involves the outcome of the gap assessment and audits of the technical controls associated with equipment and computer systems. these could result in updates, reconfiguration, or even replacement of several systems, all of which should be qualified and/or validated. depending on the extent of the changes to these systems, the amount of effort and resources will vary by project and/or system.\n3. the third factor involves the gaps associated with business processes and execution of those processes. these are typically found by executing a detailed business process review and gap assessment with those individuals responsible for executing those processes. business process changes may not be easy, especially when processes and approaches have been in place for a significant time. these types of changes can require both procedural changes and quality and business culture changes in order to implement them. the outcomes of these activities can serve as the basis for developing an initial data integrity strategy and defining the corporate data integrity program.\n\n### 6.4 implementation considerations\n\na well-defined strategy can be the key to success of a corporate data integrity program. 
this high-level plan for executing the corporate data integrity program should define the approach, timeline, resource requirements, and rationale.\n\na strategy can:\n\n- serve as a mechanism to track progress for senior management\n- provide a documented rationale and plan to outline the program and actions during audits and inspections\n- demonstrate a commitment to identifying and addressing data integrity issues, and establishing a corporate governance process for overseeing these activities.\n- provide a mechanism to ensure multisite alignment of activities and a holistic approach to data integrity compliance\n\ncorporate governance is considered critical to success. executive sponsorship should be identified and established in order to obtain support for a corporate data integrity program.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "e0099462-f639-433d-afc1-b067fd19543a": {"__data__": {"id_": "e0099462-f639-433d-afc1-b067fd19543a", "embedding": null, "metadata": {"page_label": "53", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Establishing a Comprehensive Data Integrity Program in a Regulated Company", "questions_this_excerpt_can_answer": "1. What are the four key benefits that a corporate data integrity program can deliver to a regulated company as outlined in the ISPE Records and Data Integrity Guide?\n\n2. 
How does the ISPE Records and Data Integrity Guide suggest management should respond to data integrity issues to foster an environment where employees feel safe to report these issues?\n\n3. According to the ISPE Records and Data Integrity Guide, what role does knowledge sharing and training play in the establishment of a data integrity program within a regulated company, and what specific topics should be addressed to build a good data integrity foundation?", "prev_section_summary": "This section discusses the implementation and governance of a corporate data integrity program, focusing on the allocation of effort and resources, key factors to consider, and implementation considerations. The key entities mentioned include the MHRA GMP Data Integrity Definition and Guidance for Industry, ISPE GAMP\u00ae Guide, senior management, audits and inspections, Quality Management System (QMS), technical controls, business processes, and executive sponsorship. The section emphasizes the importance of a well-defined strategy, commitment to addressing data integrity issues, and corporate governance for the success of the program.", "excerpt_keywords": "ISPE, Records, Data Integrity, Corporate, Program"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## records and data integrity\n\n### 6.4.1 sponsor\n\na sponsor is considered crucial to the overall success and will be required to:\n\n- set the direction\n- define the priorities\n- provide the resources\n- break down organizational barriers\n\nthe sponsor should help executives within a regulated company to be aware of the four key benefits that a corporate data integrity program can deliver including the:\n\n1. financial benefits\n2. reduction of risk\n3. regulatory benefits\n4. 
legal product liability impact\n\n### 6.4.2 management accountability\n\nmanagement accountability is considered critical to the success of a corporate data integrity program. when management leads by example, they demonstrate the core values of integrity in response to a failure. this can eliminate the fear of management retribution and foster an environment where employees are encouraged to identify and report data integrity issues. management should provide the appropriate resources to ensure data integrity, including personnel, instruments and systems, and sound and understandable business processes. management should acknowledge that some level of data integrity issues has occurred and will continue to occur. human factors contribute to data integrity issues, whether intentional or inadvertent. it is human nature to make mistakes and this should be recognized. management should drive a strategy that focuses on prevention, detection, and response. however, to be successful, development of this strategy requires business process knowledge and ensuring those processes support data integrity requirements. data integrity should be owned by the business and requires cross-functional supervision and participation, including it, quality unit, records management, etc.\n\n### 6.4.3 knowledge sharing and training\n\nknowledge sharing and training are closely related. when a corporate data integrity program is rolled out, there are usually several questions and topics to address and share to help build a good data integrity foundation across the regulated company. 
these can include:\n\n- what does data integrity mean and how does it apply to my day-to-day business activities?\n- what role does equipment qualification and computerized system validation play in data integrity?\n- how does data integrity relate to regulations such as 21 cfr part 11 and eu gmp annex 11?\n- what are our roles and responsibilities versus those of the regulatory agencies?", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "1e95186d-00fa-4d71-aa1b-9d58ee08bf1a": {"__data__": {"id_": "1e95186d-00fa-4d71-aa1b-9d58ee08bf1a", "embedding": null, "metadata": {"page_label": "54", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Ensuring Data Integrity in a Regulated Company: Best Practices and Guidelines", "questions_this_excerpt_can_answer": "1. What are the recommended strategies for establishing a foundation of knowledge on data integrity within an organization, according to the ISPE GAMP\u00ae Guide: Appendix M1?\n \n2. How does the ISPE Records and Data Integrity Guide suggest addressing the behavioral factors that can negatively impact data integrity in a regulated company?\n\n3. 
What are the key elements for increasing the likelihood of success in implementing a corporate data integrity program as outlined in the ISPE Records and Data Integrity Guide?", "prev_section_summary": "The section discusses the importance of establishing a comprehensive data integrity program in a regulated company, highlighting the key roles of a sponsor, management accountability, and knowledge sharing/training. The sponsor is responsible for setting direction, defining priorities, providing resources, and breaking down organizational barriers. Management accountability is crucial in fostering an environment where employees feel safe to report data integrity issues, with a focus on prevention, detection, and response. Knowledge sharing and training are essential for building a strong data integrity foundation, addressing topics such as the meaning of data integrity, equipment qualification, regulatory compliance, and roles/responsibilities.", "excerpt_keywords": "ISPE, GAMP, data integrity, regulated company, corporate program"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: appendix m1 records and data integrity\n\nmaking information readily available to all levels of an organization can be beneficial. establishing a data integrity knowledge repository or knowledge base can provide historical and current information. leveraging smes early in the process allows a foundation of data integrity knowledge to be established and used.\n\ndata integrity should be an integral part of the business processes. this can provide a robust basis for implementing more focused training. 
users of data should be formally trained to understand their:\n\n- role in maintaining data integrity\n- business processes and the information and the data they generate\n- responsibilities for identifying and escalating concerns regardless of the impact on delivery, quotas, or timelines\n\nquality and compliance roles should have advanced training and an understanding of data integrity requirements to ensure requirements are implemented within systems and processes, as well as support the business processes and business owners.\n\n### behavioral factors\n\nbehaviors can promote and encourage the appropriate actions, or damage and discourage data integrity within a regulated company. for example:\n\n- damaging behavior - cost-saving measures, which may require the sharing of passwords due to limited user license purchases\n- discouraging behavior - poorly conducted investigations that blame human error or end in no assignable cause\n\nthree factors that support fraudulent practice are:\n\n1. pressure\n2. opportunity\n3. rationalization\n\nmetrics that encourage any one of these factors can promote data integrity issues. for example, emphasis on speed versus accuracy and quality; this can force employees to cut corners and focus on the wrong things. other behavioral factors include improvisation, impartiality, and falsification for profit. poorly chosen metrics can also undermine data integrity. see section 3.3.4.\n\n### keys to success\n\nthere is no single approach when it comes to implementing a corporate data integrity program; however, there are some elements that can increase the likelihood of success. corporate data integrity program metrics should be defined and established to:\n\n- help to realize a positive return on investment: 
whenever senior management invests time, money, and resources into a program, they expect there to be a return on that investment\n- measure the success of the corporate data integrity program and demonstrate progress against defined goals", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "97470139-e2ed-4d04-8383-a1d76a2745f2": {"__data__": {"id_": "97470139-e2ed-4d04-8383-a1d76a2745f2", "embedding": null, "metadata": {"page_label": "55", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Ensuring Data Integrity: Reporting, Audits, and Review Processes\"", "questions_this_excerpt_can_answer": "1. What specific types of audits are recommended for the success of a corporate data integrity program, according to the ISPE Records and Data Integrity Guide?\n \n2. How does the ISPE Records and Data Integrity Guide suggest handling the review of audit trails in the context of routine data review/approval processes to ensure data integrity?\n\n3. According to the ISPE Records and Data Integrity Guide, what considerations should reviewers take into account when assessing the impact of manual adjustments or alterations to data or metadata on the results or product decisions?", "prev_section_summary": "The section discusses the importance of establishing a foundation of knowledge on data integrity within an organization, addressing behavioral factors that can impact data integrity, and key elements for implementing a successful corporate data integrity program. 
It emphasizes the need for data integrity to be integrated into business processes, training users on their roles and responsibilities, and ensuring quality and compliance roles are trained on data integrity requirements. The section also highlights behavioral factors that can damage data integrity and outlines key factors for success in implementing a corporate data integrity program, such as defining metrics and demonstrating progress against goals.", "excerpt_keywords": "Records, Data Integrity, Audits, Review Processes, ISPE"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## records and data integrity\n\nduring the early stages of the corporate data integrity program, reporting of data integrity issues will increase with better awareness and improved detection, but may distort any metrics. an environment of open reporting should continue to be fostered, even if this distortion is considered as \"bad news\". a reporting process can help to increase success.\n\nthe strategy for the corporate data integrity program should define the reporting expectations to senior management, area business leadership, the program team, as well as operational users. it can be an opportunity to share metrics and progress to date against the plan. it can also identify and communicate issues and provides a mechanism to agree on next steps.\n\naudit processes can be key to the success of a corporate data integrity program. 
several types of audits should be performed, e.g.:\n\n- initial gap assessment or audit of nonconformance\n- periodic audit of long term data archives\n- supplier qualification audits\n- closeout gap assessment or full audit following program completion\n- ongoing internal quality audits of established data integrity controls to ensure continuing effectiveness and compliance\n\naudits can provide critical information to set a baseline and measure the success of implementation, as well as highlight possible gaps and possible corrections and additions to project scope. for the initial and closeout assessments, the use of an independent auditor should be considered, i.e., someone independent of the core team.\n\nrobust review processes can be key to the success of a corporate data integrity program, including result review and periodic review processes. see section 4.4, appendix m4, and appendix m5.\n\nreview of individual results or sets of results prior to release should include the comparison of results against specification/limits/acceptance criteria. it should also include the evaluation of completeness and correctness of metadata. the review can provide a method to make a judgment about the accuracy and integrity of any manually entered values, as well as review any information associated with any decisions or actions taken.\n\nreviewers should assess and understand the impact that any manual adjustments or alterations to data or metadata might have on the results or product decision, as well as be aware of any changes to method versions used in creation of the result. reviewers should also make an assessment of compliance to rigorous scientific practice and documented procedures. increased result review rigor should be applied for manual adjustments and/or results that are within, but close to, specification limits.\n\naudit trails should be reviewed. 
the mhra gmp data integrity definitions and guidance for industry (2015) [1] states that: \"audit trail review should be part of the routine data review/approval process, usually performed by the operational area which has generated the data (e.g. laboratory).\"\n\nan audit trail can provide a method for assessing data integrity. appropriate and accessible audit trails can provide a technical means of preventing and detecting data integrity issues, however:\n\n- audit trails may not be easily accessible and/or permanently associated with the result, making this review difficult to complete and the detection of data integrity issues difficult to achieve", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "5b06597f-8284-46fb-9a88-17ef88db1f57": {"__data__": {"id_": "5b06597f-8284-46fb-9a88-17ef88db1f57", "embedding": null, "metadata": {"page_label": "56", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Ensuring Data Integrity in Computerized Systems: Technology Controls and Periodic Reviews\"", "questions_this_excerpt_can_answer": "1. What specific approach does the ISPE GAMP\u00ae Guide: Appendix M1 recommend for managing the review of large volumes of results and associated audit trails in regulated companies to ensure data integrity?\n \n2. How do technology controls within the framework of the ISPE GAMP\u00ae Guide: Appendix M1 facilitate a risk-based approach to data review, and what are the specific scenarios where these controls are particularly emphasized for detailed review?\n\n3. 
What are the key components and activities that should be included in the periodic reviews of computerized systems to maintain compliance and validation, as outlined in the ISPE GAMP\u00ae Guide: Appendix M1, and how do these components contribute to ensuring data integrity?", "prev_section_summary": "The section discusses the importance of maintaining data integrity in corporate programs, emphasizing the need for open reporting of data integrity issues. It highlights the key role of audit processes in ensuring data integrity, recommending various types of audits such as initial gap assessments, periodic audits, and supplier qualification audits. The section also emphasizes the importance of robust review processes, including the review of individual results, metadata, and audit trails. Reviewers are advised to assess the impact of manual adjustments or alterations to data on results or product decisions, as well as ensure compliance with scientific practices and documented procedures. Overall, the section provides guidance on reporting, audits, and review processes to ensure data integrity in corporate settings.", "excerpt_keywords": "ISPE, GAMP, data integrity, technology controls, periodic reviews"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: appendix m1\n\nrecords and data integrity\n\n- the volume of results generated can present logical and resource challenges for review of associated audit trails and metadata. regulated companies should consider planning for remediation or replacement of systems where audit trails are difficult to review.\n\n### 6.5.1 technology controls\n\ntechnology controls can provide a method to review by exception. this approach applies a risk-based approach to data review, based on alerts to highlight a subset of results requiring additional review. 
for example, results and data that are within, but close to, the specification limit, and which have been:\n\n1. manually manipulated (i.e., integration of chromatograms)\n2. reprocessed\n\ntechnology controls can highlight situations where critical data has been manually entered or changed. a detailed review should be performed on a subset of the results/data. reviewers should understand that it is their responsibility to determine and document what the minimal level of result review is, and be able to provide a documented rationale for doing so during an audit or regulatory inspection. these types of systems also require validation to verify and document the alert functionality.\n\n### 6.5.2 periodic reviews\n\ncomputerized systems require periodic review to ensure they continue to operate in a manner consistent with their intended use and remain in a compliant and validated state consistent with that use. for further information see ispe gamp(r) 5 [3]. from a data integrity perspective, periodic reviews should include the evaluation of any changes to system configuration that could impact data integrity. it should also focus on system administration activities and control, monitoring, and management of user accounts. 
other periodic review activities that should be addressed include checking that the:\n\n- system validation records are current and reflect the intended use of the computerized system\n- change control process is functioning properly, to ensure that the risk to data integrity is being considered appropriately when changes are implemented", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "f34861e1-e363-410a-8204-7732c4366937": {"__data__": {"id_": "f34861e1-e363-410a-8204-7732c4366937", "embedding": null, "metadata": {"page_label": "57", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Data Integrity Maturity Model for Regulated Companies: A Comprehensive Guide to Ensuring Compliance and Quality Assurance", "questions_this_excerpt_can_answer": "1. How does the Data Integrity Maturity Model for Regulated Companies propose assessing the maturity level of a company in relation to data integrity, and what are the key elements considered in this assessment?\n \n2. What specific approach does the Data Integrity Maturity Model recommend for regulated companies to ensure their systems and processes are designed with data integrity in mind, especially before purchasing new technology or systems?\n\n3. 
How does the Data Integrity Maturity Model utilize a dashboard approach to help regulated companies quickly assess their compliance risks and focus their improvement efforts, and what are the colors used in this dashboard to indicate different levels of compliance risk?", "prev_section_summary": "The section discusses the importance of ensuring data integrity in computerized systems, specifically focusing on technology controls and periodic reviews as outlined in the ISPE GAMP\u00ae Guide: Appendix M1. \n\nKey topics covered include:\n- Challenges in reviewing large volumes of results and associated audit trails\n- Technology controls for reviewing data by exception and highlighting critical data manipulation\n- Validation of technology controls to ensure alert functionality\n- The need for periodic reviews to maintain system compliance and validation\n- Evaluation of system configuration changes that could impact data integrity\n- Focus on system administration activities, user account management, and change control processes to mitigate risks to data integrity.", "excerpt_keywords": "Data Integrity, Maturity Model, Regulated Companies, Compliance, Quality Assurance"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## appendix m2 - data integrity maturity model\n\nthis appendix describes an approach to assessing the maturity level of a regulated company in relation to data integrity. maturity process areas are identified and maturity factors are described for aspects related to data integrity.\n\n### maturity model\n\nregulated companies should focus on modifying their processes and systems to use appropriate available technical controls, and evaluate systems for gaps prior to use. where feasible, regulated companies should design record and data integrity into their processes before purchasing systems and technology. 
purchased systems should be able to be configured to provide adequate data integrity.\n\nthe data integrity maturity model described is a simple representation of the regulated company. it is based on the status of key elements of processes needed for data integrity. regulated companies can use the data integrity maturity model to assess their current state of maturity, and understand actions and improvements required to reach the next maturity level.\n\nthe aim is to improve existing work practices, but not to define what those work practices should be for any given activity or regulated company. maturity improvement activities should be performed to achieve improvement goals tied to quality, compliance, and business objectives, rather than a rating or level.\n\nthe maturity model may also be used as a rapid and efficient, but relatively detailed management indicator, enabling regulated companies to focus resources and effort effectively. this general approach is flexible and may be structured several ways, e.g., by geographical area, site, or department.\n\nfigure 7.1 demonstrates a red, amber, and green dashboard approach. 
see section 3.3.4, which is at a higher level and more focused on key compliance risks.\n\nthis model uses concepts and approaches similar to those of the capability maturity model integration (cmmi), but is not linked or affiliated.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "0d6788ef-7977-44b6-bb24-74ba5e887a69": {"__data__": {"id_": "0d6788ef-7977-44b6-bb24-74ba5e887a69", "embedding": null, "metadata": {"page_label": "58", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Data Integrity Maturity Model and Risk Reduction Strategies for Regulated Companies", "questions_this_excerpt_can_answer": "1. What are the characteristics of each level in the Data Integrity Maturity Model as outlined in the ISPE GAMP(r) Guide's Appendix M2 Figure 7.1, and how do they differ from one another in terms of policy definition, process establishment, and monitoring practices?\n\n2. How does the ISPE Records and Data Integrity Guide suggest managing and reducing risk in relation to the different levels of data integrity maturity, particularly in terms of the necessity of behavioral, procedural, and technical controls beyond merely having a stated policy?\n\n3. 
According to the document, how should a regulated company approach the assessment of its data integrity maturity level to ensure clarity and avoid inconsistent results, especially when considering the potential variation in maturity across different sites, business areas, or departments?", "prev_section_summary": "The section discusses the Data Integrity Maturity Model for regulated companies, which focuses on assessing the maturity level of a company in relation to data integrity. Key topics include the importance of designing systems with data integrity in mind, evaluating systems for gaps before use, and using technical controls to ensure data integrity. The model helps companies assess their current state of maturity and identify actions needed to improve. It also utilizes a dashboard approach with red, amber, and green indicators to quickly assess compliance risks. The model is flexible and can be structured by geographical area, site, or department. It is not affiliated with the Capability Maturity Model Integration (CMMI).", "excerpt_keywords": "ISPE, Records, Data Integrity, Maturity Model, Risk Reduction"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: appendix m2 figure 7.1: data integrity maturity model\n\n| |level 1|level 2|level 3|level 4|level 5|\n|---|---|---|---|---|---|\n|characteristics|undefined; uncontrolled; not monitored; person dependent|partially defined; not formally controlled; not formally monitored; inconsistent monitoring|defined policy and established processes; inconsistent application; routine monitoring|defined policy and established processes; routine application|defined policy and established processes; proactive continuous improvement|\n|rating|red|amber|green| | |\n\nmanaging and reducing risk depends on the establishment of appropriate procedures and the application of appropriate behavioral, procedural, and 
technical controls.\n\nnote: having a stated policy is not considered sufficient to reduce risk to an acceptable level.\n\nmaturity levels are a continuum, rather than discrete levels or steps, so a regulated company may span more than one level for some areas. regulated companies, or parts of regulated companies, may display characteristics of more than one maturity level.\n\ndifferent sites, business areas, or departments within a regulated company may differ in data integrity maturity. for any assessment of maturity level, the scope should be well defined, to avoid confusion and inconsistent results.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "5a2f3111-3656-4477-b90e-6544462461c1": {"__data__": {"id_": "5a2f3111-3656-4477-b90e-6544462461c1", "embedding": null, "metadata": {"page_label": "59", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Ensuring Data Integrity: The Role of Organizational Culture and Processes\"", "questions_this_excerpt_can_answer": "1. How does the ISPE Records and Data Integrity Guide define the role of corporate culture in achieving data integrity objectives within pharmaceutical organizations?\n\n2. What specific structures and processes does the ISPE guide recommend for ensuring effective stakeholder engagement in maintaining data integrity in regulated business processes?\n\n3. 
According to the ISPE guide, what are the key components of a quality management system (QMS) that are focused on patient safety, product quality, and data integrity, and how do they contribute to the overall integrity of records and data within the pharmaceutical industry?", "prev_section_summary": "The section discusses the Data Integrity Maturity Model outlined in the ISPE GAMP(r) Guide, which includes five levels ranging from undefined to proactive continuous improvement. It emphasizes the importance of establishing appropriate procedures and applying behavioral, procedural, and technical controls to manage and reduce risk effectively. The document highlights that having a stated policy is not enough to ensure data integrity and that maturity levels are a continuum, with regulated companies potentially spanning multiple levels in different areas. It also addresses the variation in data integrity maturity across different sites, business areas, or departments within a company and emphasizes the need for a well-defined scope in assessing maturity levels to avoid confusion and inconsistent results.", "excerpt_keywords": "ISPE, Records, Data Integrity, Organizational Culture, Quality Management System"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## records and data integrity\n\n### appendix m2\n\n|process area|maturity factors|\n|---|---|\n|culture|data integrity understanding and awareness of the importance of data integrity and understanding of data integrity principles|\n|corporate culture and working environment|a culture of willing and open reporting for errors, omissions, atypical results, and willing collaboration to achieve data integrity objectives|\n|quality culture|an environment in which employees habitually follow quality standards, take quality focused actions, and consistently see others doing so|\n|leadership|objectives defined and communicated 
by executive management|\n|sponsorship|executive management providing appropriate resources and support|\n|structure|appropriate roles and reporting structures|\n|stakeholder engagement|engagement of business process owners, quality assurance, and key supporting technical groups (e.g., it)|\n|data ownership|clear ownership of data and data related responsibilities|\n|policies and standards|defined policies and standards on data integrity|\n|procedures|established procedures defining key activities and processes|\n|awareness and training|awareness and training on regulatory requirements and organizational policies and standards|\n|quality management system|established and effective qms, focused on patient safety, product quality, and data integrity|\n|business process definition|clear and accurate definitions of regulated business processes, covering all key gxp areas, including definition of data and data flows|\n|supplier and service provider management|assessment of suppliers and service providers against agreed standards, and setting up and monitoring of contracts and agreements to deliver those standards|\n|strategic planning and data integrity program planning|executive level strategic planning and programs for improving and/or maintaining data governance and data integrity|\n|communication|communication and change management processes, supported by a suitable repository of information and resources.|\n|regulatory awareness|awareness of applicable regulatory requirements|\n|traceability|traceability to applicable regulatory requirements from, e.g., quality manual, policies, or procedures|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "5a5f05bc-67c2-4d0e-ade5-82af60facff7": {"__data__": {"id_": "5a5f05bc-67c2-4d0e-ade5-82af60facff7", "embedding": null, "metadata": {"page_label": "60", "file_name": "[13] 
ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Ensuring Data Governance and Integrity for Regulatory Compliance", "questions_this_excerpt_can_answer": "1. How does the ISPE Records and Data Integrity Guide recommend preparing for regulatory inspections to ensure readiness, and what specific responsibilities and documentation does it emphasize?\n \n2. What strategies does the ISPE Records and Data Integrity Guide outline for managing master and reference data to ensure their accuracy, consistency, and control throughout their lifecycle?\n\n3. In the context of the ISPE Records and Data Integrity Guide, what are the key components of an effective audit trail and audit trail review process to ensure the integrity and security of electronic records and data modifications?", "prev_section_summary": "The section discusses the importance of records and data integrity within pharmaceutical organizations, focusing on factors such as corporate culture, quality culture, leadership, stakeholder engagement, data ownership, policies and standards, procedures, awareness and training, quality management systems, business process definition, supplier and service provider management, strategic planning, communication, regulatory awareness, and traceability. 
It emphasizes the role of organizational culture, processes, and structures in achieving data integrity objectives and maintaining patient safety and product quality within the pharmaceutical industry.", "excerpt_keywords": "ISPE, Records, Data Integrity, Regulatory Compliance, Audit Trail"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n|process area|maturity factors|\n|---|---|\n|regulatory inspection readiness|preparation for inspection, including responsibilities and inspection readiness documentation|\n|regulatory relationship and communications|effectiveness of communication with regulatory authorities and effectiveness of dealing with concerns and citations|\n|data life cycle definition|data life cycle(s) defined in standards and/or procedures|\n|quality risk management|application of risk management (including justified and documented risk assessments) through the data life cycle|\n|data management processes and tools|established data management processes supported by appropriate tools|\n|master and reference data management|established processes to ensure the accuracy, consistency, and control of master and reference data|\n|data incident and problem management|established processes to deal with data incidents and problems, linked with change management and deviation management as appropriate|\n|access and security management|establishing technical and procedural controls for access management and ensuring the security of regulated data and records|\n|archival and retention|establishing processes for ensuring accessibility, readability, and integrity of data in compliance with regulatory requirements including retention periods|\n|electronic signatures|effective application of electronic signatures to electronic records, where approval, verification, or other signing is required by applicable regulations|\n|audit trail and audit trail review|usable and 
secure audit trails recording the creation, modification, or deletion of records and data, allowing effective audit trail review as part of normal business processes, or during investigations or periodic review|\n|data life cycle supporting processes| |\n|data auditing|auditing against defined data quality standards, including appropriate techniques to identify data integrity failures|\n|self-inspection|inspection against defined data quality standards, including appropriate techniques to identify data integrity failures|\n|metrics|measuring the effectiveness of data governance and data integrity activities|\n|classification and assessment|data and system classification and compliance assessment activities|\n|computerized system validation and compliance|established framework for achieving and maintaining validated and compliant computerized systems|\n|control strategy|proactive design and selection of controls aimed at avoiding failures and incidents, rather than depending on procedural controls aimed at detecting failure|\n|it architecture|appropriate it architecture to support regulated business processes and data integrity|\n|it infrastructure|qualified and controlled it infrastructure to support regulated computerized systems|\n|it support|documented service model defining responsibilities and the required level of support for regulated computerized systems|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "0b590134-d76b-4862-90b1-5833446eb91c": {"__data__": {"id_": "0b590134-d76b-4862-90b1-5833446eb91c", "embedding": null, "metadata": {"page_label": "61", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": 
"application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Assessing Data Integrity Maturity Levels in Organizations: A Comprehensive Guide\"", "questions_this_excerpt_can_answer": "1. How does the ISPE Records and Data Integrity Guide categorize the maturity levels of data integrity understanding and awareness within an organization, and what are the specific characteristics of each level?\n\n2. What role does awareness of data integrity principles play in the different maturity levels outlined in the ISPE Records and Data Integrity Guide, and how does this awareness evolve from Level 1 to Level 5?\n\n3. In the context of assessing data integrity maturity levels in organizations, as per the ISPE Records and Data Integrity Guide, what is the significance of having a formal ongoing awareness program at the highest maturity level (Level 5), and how does it differentiate from the lower levels in terms of proactively keeping abreast of industry developments?", "prev_section_summary": "The section discusses the importance of ensuring data governance and integrity for regulatory compliance, as outlined in the ISPE Records and Data Integrity Guide. Key topics include preparing for regulatory inspections, managing master and reference data, effective audit trail and audit trail review processes, data management processes and tools, quality risk management, access and security management, archival and retention processes, electronic signatures, data auditing, self-inspection, metrics, classification and assessment, computerized system validation and compliance, control strategy, IT architecture, IT infrastructure, and IT support. 
The section emphasizes the need for established processes, documentation, and controls to maintain the accuracy, consistency, and security of electronic records and data throughout their lifecycle.", "excerpt_keywords": "ISPE, Records, Data Integrity, Maturity Levels, Awareness"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## records and data integrity\n\n### appendix m2\n\n7.2 data integrity maturity level characterization\n\n|maturity area|maturity factors|maturity level characterization|\n|---|---|---|\n|culture| | |\n|data integrity understanding and awareness|awareness of the importance of data integrity and understanding of data integrity principles|level 1: low awareness, limited to smes and specialists|\n| | |level 2: general awareness of the topic, but not fully reflected in working practices|\n| | |level 3: principles reflected in working practices, but not consistently applied|\n| | |level 4: data integrity principles fully incorporated and applied in established processes and practices|\n| | |level 5: formal ongoing awareness program, proactively keeping abreast of industry developments|\n|corporate culture and working environment| | |\n|quality culture| | |\n|governance and organization leadership| | |\n|sponsorship| | |", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "139a33c0-43c2-4105-951f-39ba76e1a4f2": {"__data__": {"id_": "139a33c0-43c2-4105-951f-39ba76e1a4f2", "embedding": null, "metadata": {"page_label": "62", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 
6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Data Integrity Maturity Levels and Factors in ISPE GAMP(r) Guide: A Comprehensive Overview", "questions_this_excerpt_can_answer": "1. How does the ISPE GAMP(r) Guide categorize the maturity levels of data governance roles within an organization's governance and organizational structure, and what criteria define each level?\n\n2. What are the specific criteria used in the ISPE GAMP(r) Guide to evaluate the maturity of stakeholder engagement in the context of data integrity and governance across business processes, quality assurance, and IT departments?\n\n3. According to the ISPE GAMP(r) Guide, how are data ownership responsibilities and their clarity assessed across different maturity levels, and what distinguishes each level in terms of process, system, and data owner identification and documentation?", "prev_section_summary": "The section discusses the data integrity maturity levels within organizations as outlined in the ISPE Records and Data Integrity Guide. It categorizes the levels based on factors such as culture, data integrity understanding and awareness, corporate culture, quality culture, governance, organization leadership, and sponsorship. The maturity levels range from low awareness to a formal ongoing awareness program at the highest level, with each level reflecting the incorporation and application of data integrity principles in working practices. 
The section emphasizes the importance of awareness of data integrity principles and the evolution of this awareness from Level 1 to Level 5, highlighting the significance of proactively staying informed about industry developments at the highest maturity level.", "excerpt_keywords": "ISPE, GAMP, data integrity, maturity levels, governance"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: appendix m2\n\n### table 7.2: data integrity maturity level characterization (continued) records and data integrity\n\n|maturity area|maturity factors|maturity level characterization|\n|---|---|---|\n|governance and organization structure|appropriate roles and reporting structures|level 1: no consideration of specific data governance in roles and responsibilities|\n| | |(continued)|level 2: data governance roles only recently established or in flux|\n| |level 3: data governance roles established but not always effective| |\n| |level 4: data governance roles are well integrated into the management structures| |\n| |level 5: management reviewing and adapting organizational structures based on experience| |\n|stakeholder engagement|engagement of business process owners, quality assurance, and key supporting technical groups (e.g., it)|data integrity and governance seen as either an it issue or a quality issue. no real process owner involvement. high person dependence|\n| |ad hoc involvement of process owners and quality assurance. process owners, and quality assurance typically involved, but not consistently| |\n| |process owners, quality assurance, and it work together through the data and system life cycles| |\n| |all stakeholders consistently work together to identify further cooperation opportunities, based on experience| |\n|data ownership|clear ownership of data and data related responsibilities|process, system, and data owners not defined. 
process, system, and data owners identified in few areas. process, system, and data owners typically defined in many, but not all cases, and are well defined and documented|\n| |responsibilities not always clear| |\n| |process, system, and data owner responsibilities considered and clarified during management review| |\n|policies and standards|defined policies and standards on data integrity|no established policies and standards for data integrity. ad hoc policies and standards for data integrity in some cases|\n| |policies and standards exist but not fully integrated into the qms and fully reflected in business processes and practices| |\n| |policies and standards regularly reviewed and improved based on experience| |\n|procedures|established procedures defining key activities and processes|no established procedures for key data integrity related activities. ad hoc procedures for data integrity in some cases|\n| |some procedures and standards exist but not fully integrated into the qms| |\n| |procedures for all key areas regularly reviewed and improved based on experience| |\n|awareness and training|awareness and training on regulatory requirements and organizational policies and standards|no real awareness of regulatory requirements and company policy in this area. some awareness of regulatory requirements and company policy in pockets|\n| |general awareness of well-known regulations, and the existence of company policies| |\n| |comprehensive training program ensures an appropriate level of knowledge of specific regulatory and company requirements| |\n| |formal training needs analysis, taking into account regulatory developments. training effectiveness assessment for ongoing improvement| |\n|quality management system|established and effective qms focused on patient safety, product quality, and data integrity|few procedures in place focused on patient safety, product quality, and data integrity. 
some established procedures and quality control processes, but not consistently achieving quality goals|\n| |established qms, but compliance and data integrity activities are not fully effective| |\n| |established and effective qms, consistently achieving data integrity goals in support of patient safety and product quality| |", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "88165dfa-c7b7-45f4-80f6-f27dcf603ed1": {"__data__": {"id_": "88165dfa-c7b7-45f4-80f6-f27dcf603ed1", "embedding": null, "metadata": {"page_label": "63", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Data Integrity Maturity Assessment and Improvement Framework: A Comprehensive Guide for Enhancing Data Quality and Reliability", "questions_this_excerpt_can_answer": "1. How does the ISPE Records and Data Integrity Guide categorize the maturity levels of governance and organization in terms of business process definition and its impact on data integrity?\n \n2. What criteria does the ISPE Records and Data Integrity Guide use to assess the management of suppliers and service providers in relation to maintaining data integrity standards across different maturity levels?\n\n3. 
In the context of strategic planning and data integrity program planning, how does the ISPE Records and Data Integrity Guide describe the progression from no executive level planning to successful data integrity programs achieving stated objectives?", "prev_section_summary": "The section discusses the data integrity maturity levels and factors outlined in the ISPE GAMP(r) Guide. Key topics include governance and organizational structure, stakeholder engagement, data ownership, policies and standards, procedures, awareness and training, and quality management system. The section categorizes maturity levels based on criteria such as roles and reporting structures, engagement of stakeholders, clarity of data ownership responsibilities, establishment of policies and standards, defined procedures, awareness and training on regulatory requirements, and effectiveness of the quality management system. The section provides a detailed overview of the different maturity levels and factors that contribute to data integrity within an organization.", "excerpt_keywords": "ISPE, Records, Data Integrity, Maturity Assessment, Governance"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## records and data integrity\n\n### appendix m2\n\n|maturity area|maturity factors|maturity level characterization|\n|---|---|---|\n|governance and organization (continued)|level 1|level 2|level 3|level 4|level 5|\n|business process definition|clear and accurate definitions of regulated business processes, covering all key gxp areas, including definition of data and data flows|few business processes and associated data formally defined and documented|some business processes and data formally defined and documented on an ad hoc basis, either by project or operational groups|most business processes and data defined, but not consistently following established conventions or standards, and not always 
complete and up to date|business processes and data defined and supported by appropriate tools, and consistently maintained|\n|supplier and service provider management|assessment of suppliers and service providers against agreed standards, and setting up and monitoring of contracts and agreements to deliver those standards|many suppliers and providers with a potential impact on data integrity not assessed or managed|some suppliers and providers with a potential impact on data integrity informally assessed|established process for supplier management, but not applied consistently. data integrity implications not always fully covered by assessments or agreements|\n|strategic planning and data integrity program planning|executive level strategic planning and programs for improving and/or maintaining data governance and data integrity|no planning for data integrity or data governance at executive level|limited planning for data integrity or data governance, typically driven by emergencies|specific corporate data integrity program or equivalent underway|successful data integrity programs achieving stated objectives|\n|communication|communication and change management processes, supported by a suitable repository of information and resources|no communication and change management process for data integrity|some informal and person-dependent communication and change management|formal communication and change management for data integrity in place, but on a per-project or per-site basis, with ad hoc repositories|communication and change management for data integrity integral to qms, supported by tools and central repository|\n|regulatory awareness|awareness of applicable regulatory requirements|no awareness of key regulatory requirements|some awareness of detailed regulatory requirements, based on individual experience and effort|formal regulatory awareness raising underway, including training on regulations and guidance|all staff aware of regulatory requirements affecting 
their work|\n|traceability|traceability to applicable regulatory requirements from, e.g., quality manual, policies, or procedures|no traceability to regulations|little traceability of policies and procedures to specific regulations|traceability in place but limited to key regulatory requirements|full traceability, e.g., from quality manual or policies, to specific regulatory requirements|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "2416b543-a980-428e-8b65-b59ce5e8e854": {"__data__": {"id_": "2416b543-a980-428e-8b65-b59ce5e8e854", "embedding": null, "metadata": {"page_label": "64", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Data Integrity Maturity Levels and Processes in Regulatory Compliance: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. How does the ISPE GAMP\u00ae Guide Appendix M2 Table 7.2 characterize the different levels of maturity in terms of regulatory inspection readiness within the framework of records and data integrity?\n\n2. What specific strategies or processes does the ISPE GAMP\u00ae Guide suggest for improving communication and relationship with regulatory authorities to enhance data integrity maturity levels?\n\n3. 
According to the ISPE GAMP\u00ae Guide, how are data management processes and tools, including master and reference data management, evaluated and improved across different maturity levels to ensure compliance with regulatory standards in data integrity?", "prev_section_summary": "The section discusses the maturity levels of governance and organization in terms of business process definition, supplier and service provider management, strategic planning and data integrity program planning, communication, regulatory awareness, and traceability. It outlines the characteristics of each maturity level and the criteria used to assess them, providing a comprehensive guide for enhancing data quality and reliability. The section emphasizes the importance of clear definitions, assessment of suppliers, strategic planning, communication processes, regulatory awareness, and traceability to ensure data integrity across different levels of maturity.", "excerpt_keywords": "ISPE, GAMP, Data Integrity, Regulatory Compliance, Maturity Levels"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: appendix m2 table 7.2: data integrity maturity level characterization (continued) records and data integrity\n\n|maturity area|maturity factors|maturity level characterization|\n|---|---|---|\n|regulatory inspection readiness|(continued) preparation for inspection, including responsibilities, and inspection readiness documentation|level 1|\n|no inspection readiness preparation|level 2| |\n|limited inspection readiness preparation|level 3| |\n|inspection readiness process for activities in place, but inconsistent in level, content, and approach|level 4| |\n|established inspection readiness processes regularly reviewed and refined based on regulatory and industry developments|level 5| |\n|regulatory relationship and communications|effectiveness of communication with regulatory 
authorities, and effectiveness of dealing with concerns and citations|no communication except during inspections when specific citations are addressed|\n|ad hoc, informal communication as and when required, not following a defined procedure|clear communication lines to key regulatory bodies, with internal specialists following an established process. concerns and citations are proactively managed.| |\n|data life cycle definition|data life cycle(s) defined in standards and/or procedures|data life cycles defined and maintained, supported by effective automated tools|\n|quality risk management|application of risk management (including justified and documented risk assessments) through the data life cycle|quality risk management activities subject to continuous improvement.|\n|data management processes and tools|established data management processes supported by appropriate tools|well established common data management processes, maintained, updated, supported by appropriate automated tools|\n|master and reference data management|established processes to ensure the accuracy, consistency, and control of master and reference data|well established common master/reference data management processes, maintained, updated, supported by appropriate automated tools|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "82eac73c-42df-4a02-b150-f4f8a701ada5": {"__data__": {"id_": "82eac73c-42df-4a02-b150-f4f8a701ada5", "embedding": null, "metadata": {"page_label": "65", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", 
"document_title": "Data Governance and Compliance Framework: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. How does the ISPE Records and Data Integrity Guide classify the maturity levels of data incident and problem management within an organization, and what are the key characteristics that define each level from no formal management to established data incident and problem management systems?\n\n2. According to the ISPE Records and Data Integrity Guide, what are the specific criteria and characteristics that define the maturity levels of access and security management for regulated data and records, ranging from a lack of basic access control to an established integrated system for consistent access and security management?\n\n3. In the context of archival and retention practices as outlined in the ISPE Records and Data Integrity Guide, what are the distinguishing features of each maturity level from having no consideration of long-term archival and retention periods to having archival and data retention policies and processes that are regularly reviewed against regulatory and technical developments?", "prev_section_summary": "The section discusses the data integrity maturity levels and processes in regulatory compliance as outlined in the ISPE Records and Data Integrity Guide. It includes a table (Appendix M2 Table 7.2) that characterizes different levels of maturity in terms of regulatory inspection readiness within the framework of records and data integrity. The key topics covered include regulatory inspection readiness, communication with regulatory authorities, data life cycle definition, quality risk management, data management processes and tools, and master and reference data management. 
The section highlights the importance of establishing processes, maintaining documentation, and using appropriate automated tools to ensure compliance with regulatory standards in data integrity.", "excerpt_keywords": "ISPE, Records, Data Integrity, Compliance Framework, Data Governance"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n|maturity area|maturity factors|maturity level characterization|\n|---|---|---|\n|data life cycle|(continued)|level 1|level 2|level 3|level 4|level 5|\n|data incident and problem management|established processes to deal with data incidents and problems, linked with change management and deviation management as appropriate|no formal data incident and data problem management|some data incident and data problem management|data incidents and problems typically effectively dealt with as a part of normal system or operational incident management|established data incident and problem management|established data incident and problem management|\n|access and security management|establishing technical and procedural controls for access management and to ensure the security of regulated data and records|lack of basic access control and security measures allowing unauthorized changes|some controls, but group logins and shared accounts widespread. 
password policies weak or not enforced|established standards and procedures for security and access control, but not consistently applied|established system for consistent access control and security management|established integrated system for consistent access control and security management|\n|archival and retention|establishing processes for ensuring accessibility, readability, and integrity of regulated data in compliance with regulatory requirements|no consideration of long term archival and retention periods|no effective process for identifying and meeting regulatory retention requirements|retention policy and schedule defined covering some, but not all regulated records. some systems with no formal archival process|retention schedule includes all regulated records, and those policies supported by appropriate archival and data retention policies and processes|archival and data retention policies and processes regularly reviewed against regulatory and technical developments|\n|electronic signatures|effective application of electronic signatures to electronic records, where approval, verification, or other signing is required by applicable regulations|no control of electronic signatures|lack of clear policy on signature application, and lack of consistent technical support for e-signatures|policies in place. compliant e-signatures in place for all relevant systems, supported by consistent technology where possible|electronic signature policies and processes regularly reviewed against current best practice and technical developments|\n|audit trail and audit trail review|usable and secure audit trails recording the creation, modification, or deletion of data and records, allowing effective review either as part of normal business process or during investigations|lack of effective and compliant audit trails|some limited use of audit trails. often incomplete or not fit for purpose (e.g., in content and reviewability). 
not typically reviewed as part of normal business process|audit trail in place for most regulated systems, but with undefined and inconsistent use within business processes in some cases|effective audit trail in place for all regulated systems, and use and review of audit trail included in established business processes|audit trail policies and use regularly reviewed against regulatory and technical developments|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "ba016021-8d67-4488-93e4-501c0a5b7d7b": {"__data__": {"id_": "ba016021-8d67-4488-93e4-501c0a5b7d7b", "embedding": null, "metadata": {"page_label": "66", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Data Integrity Maturity Levels and Processes: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. How does the ISPE GAMP\u00ae Guide Appendix M2 Table 7.2 classify the maturity levels of data life cycle supporting processes in terms of auditing against defined data quality standards?\n \n2. What criteria does the ISPE GAMP\u00ae Guide Appendix M2 Table 7.2 use to differentiate between the five levels of maturity in self-inspection practices for data integrity and quality within an organization?\n\n3. 
According to the ISPE GAMP\u00ae Guide Appendix M2 Table 7.2, what are the key factors and their corresponding maturity level characterizations for establishing a framework for achieving and maintaining validated and compliant computerized systems that support or maintain regulated records and data?", "prev_section_summary": "The section discusses the maturity levels of data incident and problem management, access and security management, archival and retention practices, electronic signatures, and audit trail review as outlined in the ISPE Records and Data Integrity Guide. It outlines the key characteristics that define each maturity level within these areas, ranging from no formal management to established systems for consistent management. The section emphasizes the importance of establishing processes and controls to ensure data integrity, security, compliance with regulatory requirements, and effective audit trail review within an organization.", "excerpt_keywords": "ISPE, GAMP, data integrity, maturity levels, audit trail"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: appendix m2 table 7.2: data integrity maturity level characterization (continued) records and data integrity\n\n|maturity area|maturity factors|maturity level characterization|\n|---|---|---|\n|data life cycle supporting processes (continued)|auditing|level 1|level 2|level 3|level 4|level 5|\n|auditing against defined data quality standards, including appropriate techniques to identify data integrity failures|no data quality or integrity audits performed|some audits performed on an ad hoc and reactive basis, but no established process for data quality and integrity auditing|data quality and integrity process defined, but audits not always effective and the level of follow up inconsistent|effective data auditing fully integrated into wider audit process and 
schedule|\n|self-inspection|inspection against defined data quality standards, including appropriate techniques to identify data integrity failures|no data quality or integrity self-inspection performed|some self-inspections performed on an ad hoc and reactive basis, but no established process for data quality and integrity auditing|data quality and self-inspections defined, but self-inspections not always effective and the level of follow-up inconsistent|effective data self-inspections fully integrated into wider business processes|\n|metrics|measuring the effectiveness of data governance and data integrity activities|no data related metrics captured|limited metrics captured, on an ad hoc basis|metrics captured for most key systems and datasets. level, purpose, and use inconsistent|metrics captured consistently, according to an established process for data governance and integrity|\n|classification and assessment|data and system classification and compliance assessment activities|no data classification|limited data classification, on an ad hoc basis. no formal process|data classification performed (e.g., as a part of system compliance assessment), but limited in detail and scope|established process for data classification, based on business process definitions and regulatory requirements|\n|computer system validation and compliance|established framework for achieving and maintaining validated and compliant computerized systems|systems supporting or maintaining regulated records and data are not validated|no formal process for computerized system validation. 
the extent of validation and evidence dependent on local individuals|most systems supporting or maintaining regulated records and data are validated according to a defined process, but approach is not always consistent between systems and does not fully cover data integrity risks|established process in place for ensuring that all systems supporting and maintaining regulated records and data are validated|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "b1136ef5-0a9b-4b98-bd7e-e9f558c818dc": {"__data__": {"id_": "b1136ef5-0a9b-4b98-bd7e-e9f558c818dc", "embedding": null, "metadata": {"page_label": "67", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Ensuring Data Integrity and IT Infrastructure Support for Regulated Systems", "questions_this_excerpt_can_answer": "1. How does the ISPE Records and Data Integrity Guide classify the maturity levels of control strategy in supporting data life cycle processes, and what criteria define each level from a proactive design perspective to fully integrating integrity into processes before system and technology purchases?\n\n2. What are the specific criteria used in the ISPE Records and Data Integrity Guide to evaluate the maturity levels of IT architecture in supporting regulated business processes and ensuring data integrity, from no consideration of IT architecture strategy to regularly reviewing the IT architecture strategy?\n\n3. 
According to the ISPE Records and Data Integrity Guide, how is the maturity of IT support for regulated computerized systems assessed, from having no consideration or responsible individual for IT support to regularly reviewing and refining the service level model based on performance, specific concerns, and trends?", "prev_section_summary": "The section provides an excerpt from the ISPE Records and Data Integrity Guide, specifically focusing on data integrity maturity levels and processes. It includes a table (Appendix M2 Table 7.2) that classifies the maturity levels of data life cycle supporting processes in terms of auditing against defined data quality standards, self-inspection practices, metrics, classification and assessment, and computer system validation and compliance. The table outlines the different levels of maturity for each area, from no audits or self-inspections performed to effective integration of data auditing and self-inspections into wider business processes. Key factors such as data quality and integrity audits, metrics capturing, data classification, and system validation are discussed in relation to achieving and maintaining validated and compliant computerized systems that support or maintain regulated records and data.", "excerpt_keywords": "ISPE, Records, Data Integrity, IT Infrastructure, Regulated Systems"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## records and data integrity\n\n### appendix m2\n\n|maturity area|maturity factors|maturity level characterization|\n|---|---|---|\n|data life cycle supporting processes (continued)|control strategy|proactive design and selection of controls aimed at avoiding failures and incidents, rather than depending on procedural controls aimed at detecting failure|\n| |maturity level characterization|level 1|level 2|level 3|level 4|level 5|\n|data life cycle supporting processes 
(continued)|control strategy|no consideration of potential causes of data integrity failures and relevant controls|some application of controls, typically procedural approaches|technical and procedural controls applied, but dependent on individual project or system|technical and procedural controls are applied in most cases, based on an established risk-based decision process|integrity fully designed into processes before purchase of systems and technology|\n| |it architecture|appropriate it architecture to support regulated business processes and data integrity|\n| |maturity level characterization|level 1|level 2|level 3|level 4|level 5|\n| |it architecture|no consideration of it architecture strategy|it architecture strategy and decisions not documented, and dependent on local smes|it architecture considered, and generally supports data integrity and compliance, but is typically defined on a system by system basis|established it architecture policy and strategy, with full consideration on how this supports data integrity|it architecture strategy regularly reviewed|\n| |it infrastructure|qualified and controlled it infrastructure to support regulated computerized systems|\n| |maturity level characterization|level 1|level 2|level 3|level 4|level 5|\n| |it infrastructure|no infrastructure qualification performed|no established process for infrastructure qualification. 
some performed, dependent on local smes|infrastructure generally qualified according to an established process, but is often a document-driven approach, sometimes applied inconsistently|established risk-based infrastructure qualification process, ensuring that current good it practice is applied, supported by tools and technology|infrastructure approach regularly reviewed against industry and technical developments|\n| |it support|documented service model defining responsibilities and the required level of support for regulated computerized systems|\n| |maturity level characterization|level 1|level 2|level 3|level 4|level 5|\n| |it support|no consideration of what support is required with no individual responsible|no defined model. support dependent on experienced individuals|service level model is established per system, but with evidence of inconsistent application, measurement, and reporting|risk-based service level model consistently applied across systems, with evidence of measurement and reporting|service level model regularly reviewed and refined based on performance against targets, specific concerns, and trends|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "d391ee38-c10a-4005-9b69-e3cc8a255671": {"__data__": {"id_": "d391ee38-c10a-4005-9b69-e3cc8a255671", "embedding": null, "metadata": {"page_label": "68", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Empty Space: A Lack of Content in Modern Society\"", "questions_this_excerpt_can_answer": "Based on the provided context, here 
are three questions that this specific context can provide answers to, which are unlikely to be found elsewhere:\n\n1. **What is the file size of the ISPE Records and Data Integrity Guide as stored in the PharmaWise Engineer project on Google Drive?**\n - This question is specific to the document's storage details within a particular project environment, which would not be commonly available or relevant outside this context.\n\n2. **What is the discrepancy between the creation and last modified dates of the document titled \"Empty Space: A Lack of Content in Modern Society\" found within the ISPE Records and Data Integrity Guide?**\n - This question addresses the document management and version control aspects specific to this file, highlighting the process of content revision and finalization within this project environment.\n\n3. **How does the document titled \"Empty Space: A Lack of Content in Modern Society\" relate to the broader theme of the ISPE Records and Data Integrity Guide within the PharmaWise CSV & Data Integrity project?**\n - Given the unique juxtaposition of a document focused on societal content within a technical guide, this question seeks to understand the thematic or illustrative purpose of such a document within the context of data integrity and compliance in the pharmaceutical industry, which would be a nuanced discussion specific to this collection of documents.\n\nThese questions leverage the detailed metadata and the document's thematic title to inquire about aspects that are uniquely tied to this specific context, such as project management details, content relevance, and thematic integration within a specialized field.", "prev_section_summary": "The section discusses the maturity levels of control strategy, IT architecture, IT infrastructure, and IT support in supporting data integrity and regulated systems according to the ISPE Records and Data Integrity Guide. 
It outlines the criteria defining each maturity level, from no consideration to fully integrating integrity into processes. The key entities include data life cycle supporting processes, control strategy, IT architecture, IT infrastructure, and IT support, with specific characteristics and levels of maturity outlined for each. The section emphasizes the importance of proactive design, risk-based decision processes, established policies and strategies, infrastructure qualification, and service level models in ensuring data integrity and compliance in regulated systems.", "excerpt_keywords": "ISPE, Records, Data Integrity, Guide, PharmaWise Engineer"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\nno_content_here", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "aa745544-1110-4eca-a641-489f2f1a44fa": {"__data__": {"id_": "aa745544-1110-4eca-a641-489f2f1a44fa", "embedding": null, "metadata": {"page_label": "69", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "The Impact of Corporate and Local Cultures on Data Integrity: A Comprehensive Analysis", "questions_this_excerpt_can_answer": "1. How does the ISPE Records and Data Integrity Guide propose organizations address the impact of both corporate and local cultures on data integrity practices?\n \n2. 
What specific human factors does the guide identify as critical to enhancing data integrity within pharmaceutical organizations, according to the appendix m3 - human factors?\n\n3. According to the guide, how do different cultural types (corporate and geographic) influence the approach needed to ensure data integrity, and what examples of behavioral attributes are used to classify these cultural impacts?", "prev_section_summary": "The key topics of this section include document storage details, version control, and thematic integration within the PharmaWise Engineer project. The entities mentioned are the ISPE Records and Data Integrity Guide, the document titled \"Empty Space: A Lack of Content in Modern Society,\" and the PharmaWise CSV & Data Integrity project. The section highlights the unique questions that can be answered based on the provided context and metadata, focusing on specific details relevant to this project environment.", "excerpt_keywords": "ISPE, Records, Data Integrity Guide, Corporate Culture, Local Culture"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## appendix m3 - human factors\n\n### 8.1 introduction\n\nconsideration of various human factors is considered critical for effective data integrity. 
consideration should be given to:\n\n- understanding and mitigating the impact of corporate and local cultures\n- understanding the classification and underlying root cause of incidents (from minor lapses to fraud)\n- implementing mechanisms to minimize human error rates\n- reducing motivation, pressures, and opportunities for data falsification and fraud\n- promoting impartiality in quality-related decision making\n- applying effective behavioral controls - influencing behaviors and attitudes\n\n### 8.2 corporate and local cultures\n\ncultural considerations can refer to a corporate culture, i.e., the model within which an organization operates, or to a local geographic culture, i.e., the moral and behavioral norm within a country or region.\n\n#### 8.2.1 corporate culture\n\ncorporate culture can vary widely from a family-owned private company to a publicly traded corporation; however, from a regulatory perspective, the expectation for data integrity and product quality is the same.\n\n#### 8.2.2 local geographic culture\n\ngeographic culture can have a significant impact on site operations. 
one system of classification for cultures works on a scale of key behavioral attributes (meyer, 2014) [20].\n\n|effects of different cultural types| |\n|---|---|\n|naturally supporting data integrity|additional effort needed for data integrity|\n|clear; low context communication|high context|\n|relaxed, informal management style|strict; formal|\n|consensus decision making|top down|\n|comfortable giving negative feedback|very uncomfortable|\n|confrontational disagreement|avoids confrontation|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "aeb52f72-31b0-4786-b8fa-9f0107928cbf": {"__data__": {"id_": "aeb52f72-31b0-4786-b8fa-9f0107928cbf", "embedding": null, "metadata": {"page_label": "70", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Cultural Differences and Incident Classification in Data Integrity Management: A Comprehensive Analysis", "questions_this_excerpt_can_answer": "1. How does the ISPE GAMP\u00ae Guide: Appendix M3 suggest management should address the challenge of fostering an environment of openness around data integrity, especially in cultures less inclined towards openness?\n\n2. According to the document, what are the key differences in communication styles between US, Canadian, Australian cultures, and those of China and Japan, and how might these differences impact international collaborations in the context of data integrity management?\n\n3. 
What approach does the document recommend for mitigating incidents affecting data integrity, especially when multiple incidents have occurred, and how does it suggest incorporating changes to governance, management systems, and behaviors?", "prev_section_summary": "The section discusses the importance of considering human factors in ensuring data integrity within pharmaceutical organizations. It highlights the impact of corporate and local cultures on data integrity practices, emphasizing the need to understand and mitigate cultural influences, reduce human error rates, and promote impartiality in decision-making. The guide distinguishes between corporate culture (varying from family-owned to publicly traded companies) and local geographic culture, which can significantly impact site operations. Examples of behavioral attributes are provided to classify different cultural types and their influence on data integrity efforts.", "excerpt_keywords": "ISPE, GAMP, data integrity, cultural differences, incident classification"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: appendix m3 records and data integrity\n\nopenness and a willingness to discuss difficult situations can support an environment where failing results are seen as a group problem to be resolved, with clearly documented corrective actions that mitigate the manufacturing or other root cause. management should help employees to achieve the openness around data integrity that is needed for compliance; significant additional effort will be needed within the cultural types possessing behavioral norms that are naturally less inclined to openness, e.g., those with highly formal management structures combined with an aversion to negative feedback and confrontation.\n\n### 8.2.3 cultural differences\n\ndifferences in corporate culture can lead to difficulties in working. 
for example, suppliers should allow sufficient time for an extended approval process for a customer that is a regulated company. working across different geographic cultures can create issues and misunderstandings, e.g., cultures may vary in the way they communicate:\n\n- us, canadian, and australian cultures: instinctively based around low-context communication, where the key message is laid out simplistically, assuming very few shared reference points\n- china and japan: using high-context communication that relies on implicit knowledge to fill in the context between the verbal or written phrases and may be based more on body language and linguistic nuance than on clear statements\n\nhigh-context communication across different nationalities may:\n\n- give rise to significant misunderstandings during collaborations\n- make written communication more prone to misinterpretation\n\n### 8.3 classification of incidents\n\nfigure 8.2 provides a useful classification of incidents affecting data integrity [21]. the approach taken to mitigate different types of incidents should consider changes to governance, management systems, and behaviors. 
if there have been multiple incidents, then root cause investigation should be performed in addition to taking specific actions for individual incidents.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "c49d5b70-70ca-4ccf-a470-0fc96cc0098f": {"__data__": {"id_": "c49d5b70-70ca-4ccf-a470-0fc96cc0098f", "embedding": null, "metadata": {"page_label": "71", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Strategies for Minimizing Human Error in Data Integrity", "questions_this_excerpt_can_answer": "1. How does the ISPE Records and Data Integrity Guide classify incidents of data integrity breaches, and what are examples of non-intentional and intentional incidents?\n \n2. According to the ISPE Records and Data Integrity Guide, why do regulators not distinguish between human error and data falsification when assessing the impact of a data integrity failure, and what are the potential impacts of such failures?\n\n3. What specific strategies does the ISPE Records and Data Integrity Guide recommend for minimizing human error in data integrity, and how do these strategies leverage human strengths while mitigating weaknesses?", "prev_section_summary": "The section discusses the importance of openness in addressing data integrity issues, especially in cultures less inclined towards openness. 
It also highlights the key differences in communication styles between US, Canadian, Australian cultures, and those of China and Japan, and how these differences can impact international collaborations in data integrity management. Additionally, the section provides insights on incident classification in data integrity management, recommending changes to governance, management systems, and behaviors to mitigate incidents and conducting root cause investigations for multiple incidents.", "excerpt_keywords": "ISPE, Records, Data Integrity, Human Error, Incident Classification"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## records and data integrity\n\nfigure 8.2: classification of incidents\n\n|non-intentional|intentional|\n|---|---|\n|slips and lapses|mistakes|\n| |situational violation|\n| |optimizing violation|\n| |intentionally misleading|\n|slips and momentary lapses of concentration can be treated as one-off incidents if there is no pattern:|mistakes are often associated with areas of insufficient control or too much complexity:|\n|example: jumping over and forgetting to complete a single manual data-cell entry concerning supplementary information in batch record spreadsheet.|example: excessively long reference codes that when hand-typed lead to errors.|\n| |example: in a rush to get crashed systems up and running again after a storm, an operator deletes a record of analytical sample run that was corrupted by an electrical surge.|\n| |example: ineffective change control leads to modification of an analytical method that was appropriate for one product but not for another product sharing the same method.|\n| |example: change system configuration to hide sampling errors and thereby avoid raising deviations.|\n| |example: inserting \"pass\" values into a laboratory database when individual test results were out of registered specification.|\n\n## human 
error\n\ndata integrity issues often arise from genuine human error; however, regulators do not distinguish between human error and data falsification when assessing the impact of a data integrity failure. any data integrity failure can potentially impact patient safety and/or product quality and, therefore, efforts should be made to minimize human error. human error may be indicative of failures in systems and processes within a regulated company. the root cause of failures may be a combination of failures involving several personnel and across several processes. when transparent, open investigations are performed to determine the true root cause of failures and followed up with effective solutions, the incidence of human error can be reduced. monitoring of human error rates can provide an indicator of the regulated company's error culture. consistently high incidences of error changing little over time could show that mistakes are accepted as inevitable, with no effort made to improve working practices. effective mechanisms to reduce human error rates include:\n\n1. using personnel less: humans are considered naturally poor at manual data entry and, therefore, this should be avoided by implementing direct interfacing of equipment and automated transfer of data.\n2. using personnel only for their strengths: humans are considered highly effective at monitoring multiple systems simultaneously; however, it would require a complex computerized system to achieve the same monitoring function.\n3. 
limit the opportunity for human error, e.g., by using drop down lists in place of free text entry so that searching for a specific product name will not fail because of a spelling error.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "053ac88c-7713-4384-8777-732db8b2d85c": {"__data__": {"id_": "053ac88c-7713-4384-8777-732db8b2d85c", "embedding": null, "metadata": {"page_label": "72", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Preventing Data Falsification and Fraud in GAMP(r) Guide: Appendix M3 Records and Data Integrity", "questions_this_excerpt_can_answer": "1. What are some common fraudulent activities identified in the ISPE GAMP(r) Guide: Appendix M3 Records and Data Integrity related to data falsification for profit in the pharmaceutical industry?\n \n2. How do administrative and technical controls contribute to reducing the opportunities for data falsification and fraud, according to the ISPE GAMP(r) Guide: Appendix M3, and what role does corporate culture play in the effectiveness of these controls?\n\n3. 
What components constitute the \"fraud triangle\" as outlined in the ISPE GAMP(r) Guide: Appendix M3 Records and Data Integrity, and how do they contribute to the potential for committing fraud within the context of data integrity and control systems?", "prev_section_summary": "The section discusses the classification of incidents related to records and data integrity, including non-intentional incidents such as slips and lapses, and intentional incidents such as mistakes and violations. It highlights the importance of minimizing human error in data integrity to prevent impacts on patient safety and product quality. The section also provides strategies for reducing human error rates, such as avoiding manual data entry, leveraging human strengths for monitoring tasks, and limiting opportunities for error through system design. The ISPE Records and Data Integrity Guide emphasizes the need for transparent investigations to identify root causes of failures and implement effective solutions to reduce human error.", "excerpt_keywords": "ISPE, GAMP, Records, Data Integrity, Fraud Prevention"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: appendix m3 records and data integrity\n\n### 8.5 data falsification and fraud\n\n8.5.1 falsification for profit\n\npersonal gain or self-interest has been the motivator in several high-profile fraud cases. commonplace fraudulent activities include:\n\n- unofficial testing to see if a sample will pass before running the \"official\" sample for the batch record\n- concealing, destroying, or overwriting original data and samples\n- re-naming or misrepresenting results from a passing batch in support of other batches\n- manually manipulating chromatography integrations to alter the result\n\nadministrative and technical controls can be used to reduce the opportunities for falsification. 
the extent and impact of falsification can be magnified if collusion is involved. geographic and corporate cultures can influence the degree to which collusion may be prevented; strongly hierarchical cultures may be more susceptible to collusion as these cultures inherently discourage any disagreement with authority figures.\n\n8.5.2 reducing fraud\n\na framework for understanding how an individual comes to commit fraud is the fraud triangle [23]:\n\nopportunity potentially comes from:\n- weak system controls\n- no restriction on data deletion\n- accessible hard drive\n- using paper records for complex data (potentially missing key dynamic data)\n- lack of effective review processes", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "f0a3bbce-445a-43b6-963c-b04145a54a4e": {"__data__": {"id_": "f0a3bbce-445a-43b6-963c-b04145a54a4e", "embedding": null, "metadata": {"page_label": "73", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Enhancing Data Integrity: A Comprehensive Approach through Technical, Behavioral, and Impartial Controls", "questions_this_excerpt_can_answer": "1. What specific guidance does the WHO Annex 5 provide regarding the management governance necessary to ensure data and record management integrity, as referenced in the ISPE Records and Data Integrity Guide?\n\n2. 
How does the ISPE Records and Data Integrity Guide suggest managing and mitigating pressure on employees to ensure data integrity and overall product quality, particularly in relation to workplace targets and resource provision?\n\n3. What are the key components of effective data integrity controls as outlined in the ISPE Records and Data Integrity Guide, and how do these controls contribute to preventing data falsification and ensuring the integrity of data management processes?", "prev_section_summary": "The section discusses data falsification and fraud in the pharmaceutical industry, highlighting common fraudulent activities such as unofficial testing, concealing original data, and manipulating results. It emphasizes the importance of administrative and technical controls in reducing opportunities for falsification and the role of corporate culture in preventing collusion. The section also introduces the fraud triangle framework, which identifies opportunity, pressure, and rationalization as factors contributing to fraud.", "excerpt_keywords": "ISPE, Records, Data Integrity, Fraud, Controls"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: page 71\n\n### records and data integrity appendix m3\n\n* no audit trail review\n\n* approval based on reviewing the paper report only\n\nrobust technical controls within all of the data generation, collection, processing, or storage systems, coupled with effective data review processes can, therefore, reduce the opportunities for fraud.\n\npressure can be managed and mitigated by ensuring that workplace targets are achievable with the equipment and resources that have been provided, and that any metrics monitored focus on data integrity and overall product quality rather than throughput or pass rates.\n\na corporate culture of openness and honesty at all levels, with management appreciation and other 
employee incentives for highlighting quality issues and concerns, is considered essential to prevent any rationalization that falsifying data can ever be in either the individual's or the company's best interests.\n\n### impartiality\n\nthe who annex 5: guidance on good data and record management practices [12] states that:\n\n\"elements of effective management governance should include: ... assurance that personnel are not subject to commercial, political, financial and other organizational pressures or incentives that may adversely affect the quality and integrity of their work;\"\n\nfor example, a qc laboratory supervisor should report through the independent quality assurance department.\n\n### behavioral controls\n\nthe interaction of soft skills needed to guide people's behavior and responses should be considered as a way to assist in supporting data integrity.\n\n#### understanding effective controls\n\neffective data integrity controls include those that:\n\n* do not solely rely on people's actions\n\n* are built in\n\n* are easy to comply with\n\n* are well communicated and understood\n\n* management will support and enforce\n\n* have backups/contingencies\n\n* make errors/failures clearly visible\n\n* fail over to a safety condition\n\n* focus on resolving problems rather than taking punitive action
"last_modified_date": "2024-04-04", "document_title": "\"Data Integrity Training Program: Ensuring Compliance and Accountability in a Regulated Corporate Environment\"", "questions_this_excerpt_can_answer": "1. How does the ISPE GAMP\u00ae Guide: Appendix M3 suggest a corporate data integrity training program should address the role of management in fostering a quality culture within a regulated company?\n\n2. According to the ISPE GAMP\u00ae Guide: Appendix M3, what specific training content variation does it recommend for large regulated companies versus small regulated companies to effectively address data integrity?\n\n3. What are the specific training recommendations provided in the ISPE GAMP\u00ae Guide: Appendix M3 for data stewards or personnel with QA responsibilities to enhance their understanding of data integrity and regulatory expectations?", "prev_section_summary": "The section discusses the importance of data integrity in pharmaceutical manufacturing and provides guidance on managing and mitigating pressure on employees to ensure data integrity and product quality. Key topics include robust technical controls, managing workplace targets, corporate culture, impartiality, and behavioral controls. The section emphasizes the need for effective data integrity controls that are built-in, easy to comply with, well communicated, and supported by management. 
It also highlights the importance of creating a culture of openness and honesty to prevent data falsification.", "excerpt_keywords": "ISPE, GAMP, data integrity, training program, regulatory expectations"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: appendix m3 records and data integrity\n\n### 8.7.2 corporate data integrity training program\n\na corporate data integrity training program should address behavioral factors and drive a strategy that focuses on reducing the risk to data integrity. training can assist in providing personnel with the:\n\n- knowledge and understanding of what data integrity is\n- importance data integrity has for a regulated company\n- personal role each employee has in relation to data integrity\n\ndata integrity training should be implemented at all levels of the corporation in order to have a positive effect on a regulated company's quality culture. management should set an example and foster an environment that promotes and ensures a \"speak up\"/\"quality first\" culture.\n\ntraining should also include practices that support good data governance. for example, establishment of multisite standards for management of supporting data (metadata) that allows analytics that span the entire regulated company.\n\na corporate data integrity training program should be both general and specific. it should target the correct audiences and should consider the specific scale of the regulated company.\n\nin a large regulated company, high-level training for all employees might be at a foundational level, but the content and focus of additional training may vary significantly for different functions, e.g., the consequences of a data integrity issue will differ significantly for a line operator compared to the operations director. 
this training approach might be ineffective, however, for a small regulated company where both the foundational and detailed training might be more effectively rolled out simultaneously.\n\nat the operator level, data integrity should be inherent within the process and should not be compromised to meet delivery timelines.\n\nusers of data should be formally trained to understand their role in maintaining the integrity of the data they handle. they are normally responsible for highlighting and escalating any concerns about data and quality irrespective of the impact on production quota or deadlines. training provided to personnel creating or using regulated data should ensure:\n\n- understanding of data integrity\n- understanding of data life cycles\n- emphasis on good data management\n- emphasis on good documentation practices\n\ndata stewards or personnel with qa responsibilities should be given additional training to allow a deeper understanding of technical expectations and requirements, inspection and auditing techniques, and process governance.\n\nit is a regulatory expectation that the data life cycle is understood throughout the regulated company's processes and systems. 
personnel in roles that own processes and systems should be aware of their role and responsibility in maintaining data integrity, e.g.:\n\n- understanding how and where data is used, and its impact on product quality and patient safety", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "3074e5fb-ae43-45f7-bc05-1e124a424e15": {"__data__": {"id_": "3074e5fb-ae43-45f7-bc05-1e124a424e15", "embedding": null, "metadata": {"page_label": "75", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Ensuring Data Integrity Through Proper Training and Resource Management: A Comprehensive Guide\"", "questions_this_excerpt_can_answer": "1. How does the ISPE Records and Data Integrity Guide suggest addressing the challenge of improvisation in the workplace to ensure data integrity, and what specific strategies are recommended for management to mitigate the drivers of improvisation?\n\n2. What role-based training recommendations does the ISPE Records and Data Integrity Guide provide for enhancing data integrity among quality and compliance personnel, and how does it suggest utilizing US FDA warning letters in this training?\n\n3. 
According to the ISPE Records and Data Integrity Guide, what are the key aspects that data stewards should be knowledgeable about in the system functionality to maintain data integrity, and how should this knowledge influence their review processes throughout the data life cycle?", "prev_section_summary": "The section discusses the importance of a corporate data integrity training program in a regulated environment, emphasizing the role of management in fostering a quality culture. Key topics include the knowledge and understanding of data integrity, the importance of data integrity for a regulated company, and the personal role of each employee in maintaining data integrity. The section also highlights the need for training at all levels of the corporation, the establishment of good data governance practices, and the specific training recommendations for different functions within large and small regulated companies. Additionally, it mentions the importance of training data stewards and personnel with QA responsibilities to enhance their understanding of data integrity and regulatory expectations.", "excerpt_keywords": "ISPE, Records, Data Integrity, Training, Improvisation"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## records and data integrity\n\n* knowing what other review processes and data stewards are involved in the data life cycle, particularly those downstream of the system\n\n* in-depth knowledge of the system functionality with potential impact on data integrity, and how to detect such activity\n\ntraining on the general principles of data integrity could be complemented by more detailed, contextual training appropriate for data stewards who have a direct role in data handling. the specific training provided to such persons (including quality and compliance personnel) must extend past the general requirements and definitions of data integrity. 
this role-based training should focus on critical thinking and auditing techniques, and could include specific use cases related to the roles. for example, data integrity training for laboratory auditors and process owners might include a comprehensive review of us fda warning letters that describe data integrity observations in laboratory settings, and practical exercises around examining audit trails. see appendix m4.\n\n## improvisation\n\nimprovisation is the ability to work around a lack of people, absent or damaged equipment, and even a lack of training, to \"get the job done, somehow\". improvisation can be widespread where insufficient or inappropriate resources are commonplace. sops or other controls may not be followed within a culture of improvisation. the integrity of any data produced by improvisation should be considered questionable. improvisation as a working practice should be discouraged. management should remove the drivers for improvisation by:\n\n- providing sufficient competent people to complete assigned tasks (e.g., overworked personnel may feel pressured to maximize yield or productivity at the expense of data integrity)\n- providing sound, reliable equipment and instrumentation for the production and quality personnel to achieve the expected throughput (e.g., outdated equipment may not provide the technical controls for data integrity nor produce accurate data. 
frequent equipment downtime can increase pressure on personnel to seek alternative ways to keep up with their workload)\n- maintaining the facilities and operating environment in a fit state for their intended purpose (e.g., lack of physical security and poor it infrastructure can jeopardize data integrity, e.g., by allowing unauthorized access to a server room, or by losing data from a local hard drive in a laboratory)", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "265c91d5-129c-4172-bee0-de86d9f76028": {"__data__": {"id_": "265c91d5-129c-4172-bee0-de86d9f76028", "embedding": null, "metadata": {"page_label": "76", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Blank Canvas: A Collection of Unique Entities and Themes\"", "questions_this_excerpt_can_answer": "Based on the provided context, here are three questions that the context can provide specific answers to, which are unlikely to be found elsewhere:\n\n1. **What is the significance of the document titled \"Blank Canvas: A Collection of Unique Entities and Themes\" within the scope of the ISPE Records and Data Integrity Guide?**\n - This question seeks to understand the relevance or role of a document with a seemingly unrelated title to the broader topic of records and data integrity in the pharmaceutical industry, as guided by ISPE standards.\n\n2. 
**How does the document \"Blank Canvas: A Collection of Unique Entities and Themes\" contribute to the understanding or implementation of data integrity principles in pharmaceutical development, as outlined in the ISPE guide?**\n - This question aims to uncover the specific contributions or insights the document provides towards the practical application of data integrity principles, which is a critical aspect of pharmaceutical development and compliance.\n\n3. **Considering the document's creation and last modified dates in 2024, what new perspectives or updates does it introduce to the existing ISPE Records and Data Integrity Guide framework?**\n - Given the future dates, this question explores what novel content or updates the document might introduce, reflecting advancements or shifts in the field of data integrity and records management in the pharmaceutical industry.\n\nThese questions are designed to probe the unique intersection of the document's content with the broader themes of records and data integrity in pharmaceutical development, leveraging the specific details provided in the context.", "prev_section_summary": "The section discusses the importance of records and data integrity in the pharmaceutical industry, emphasizing the need for knowledge of system functionality and review processes throughout the data life cycle. It also addresses the issue of improvisation in the workplace, highlighting the risks it poses to data integrity and providing strategies for management to mitigate drivers of improvisation. Role-based training recommendations for quality and compliance personnel are outlined, including the use of US FDA warning letters in training. 
Overall, the section emphasizes the importance of proper training and resource management in ensuring data integrity in pharmaceutical operations.", "excerpt_keywords": "ISPE, Records, Data Integrity, Pharmaceutical Industry, Compliance"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\nno_content_here", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "21e122af-aea6-4d94-95d6-35445476f784": {"__data__": {"id_": "21e122af-aea6-4d94-95d6-35445476f784", "embedding": null, "metadata": {"page_label": "77", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Risk-Based Approach to Data Audit Trails and Audit Trail Review in GxP Regulated Computerized Systems: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. What specific regulatory requirements are outlined for audit trails in the context of GxP regulated computerized systems, and how do these requirements relate to EU GMP Annex 11 and 21 CFR Part 11?\n \n2. How does the document differentiate between data audit trails and technical system logs in the context of GxP regulated environments, and why is this distinction important for ensuring data integrity and compliance with existing regulations?\n\n3. 
What are the three main types of data audit trail review recommended in the document for GxP regulated computerized systems, and how does each type contribute to maintaining data integrity and supporting regulatory compliance?", "prev_section_summary": "The section discusses a document titled \"Blank Canvas: A Collection of Unique Entities and Themes\" within the context of the ISPE Records and Data Integrity Guide. It raises questions about the significance of the document, its contribution to understanding data integrity principles in pharmaceutical development, and the potential updates it introduces to the existing ISPE framework. The section aims to explore the unique intersection of the document's content with the broader themes of records and data integrity in the pharmaceutical industry.", "excerpt_keywords": "ISPE, GxP regulated, data audit trails, audit trail review, regulatory compliance"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: page 75\n\n## records and data integrity appendix m4\n\n### appendix m4 - data audit trail and audit trail review\n\n### 9.1 introduction\n\nthis appendix describes a risk-based approach to data audit trails and audit trail review for gxp regulated computerized systems. it places audit trails in the wider context of information security, and suggests a practical approach for audit trails and audit trail review within that wider framework. it outlines the current regulatory requirements for audit trails, as defined in eu gmp annex 11 [7], 21 cfr part 11 [2], and associated guidance. audit trails should be specified, implemented, and controlled. audit trails can be useful in supporting routine in-process reviews of critical electronic records, and as investigative tools. 
for example, in chromatography systems audit trails can show changes to methods associated with reprocessed results, thereby helping a reviewer to identify instances of testing (or processing) into compliance.\n\nreviews should be performed of audit trail content that has a direct impact on reported values that will be used for product or patient decisions. general routine, historical, retrospective, non-targeted reviews of audit trail content should be avoided; they can consume significant resources and may not discover atypical data. examining audit trails for a specific set of records as part of an in-process review, or during an investigation where data integrity has been determined to be uncertain, can help to determine the integrity of the records in question.\n\nthere are three main types of data audit trail review:\n\n1. review of data audit trails as part of normal operational data review and verification\n2. review of audit trails for a specific data set during an investigation (e.g., of deviations or data discrepancies)\n3. review and verification of effective audit trail functionality (e.g., verification of audit trail configuration as part of periodic review)\n\nholistic and risk management based decisions on the need for audit trails and the review of audit trails should be based upon:\n\n- a detailed understanding of the process supported by the computerized system\n- applicable gxp requirements\n- the risk to patient safety, product quality, and data integrity\n\ndata audit trails normally record the creation, modification, or deletion of records and data. technical system logs normally record various system, configuration, and operational events. technical system logs should not be regarded as equivalent to data audit trails. this distinction is consistent with existing regulations and normal it good practice and terminology. 
where systems lack appropriate audit trails, alternative arrangements to verify the accuracy of data should be implemented, e.g., administrative procedures, secondary checks, and controls. technical system logs may be helpful, e.g., in case of investigations or in the absence of true audit trails.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "861e2aab-9812-4285-9849-7a1250c34b8d": {"__data__": {"id_": "861e2aab-9812-4285-9849-7a1250c34b8d", "embedding": null, "metadata": {"page_label": "78", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Ensuring Data Integrity and Compliance: Audit Trails, Time Stamps, and Security Measures\"", "questions_this_excerpt_can_answer": "1. What specific guidance does the MHRA GMP Data Integrity Definitions and Guidance for Industry (2015) provide regarding the design of computerized systems for capturing, processing, reporting, or storing raw data electronically in terms of audit trails?\n \n2. How does the US FDA 21 CFR Part 11 section 11.10 (e) define the requirements for audit trails in relation to electronic records, and what are the expectations regarding the retention and accessibility of such audit trail documentation?\n\n3. 
According to the FDA Guidance for Industry: Part 11, Electronic Records; Electronic Signatures - Scope and Application, what factors should companies consider when deciding to apply audit trails or other measures to ensure the integrity of electronic records, and how should time stamps in audit trails be implemented in terms of time zone references?", "prev_section_summary": "This section discusses the risk-based approach to data audit trails and audit trail review in GxP regulated computerized systems. It outlines the regulatory requirements for audit trails as defined in EU GMP Annex 11 and 21 CFR Part 11, and emphasizes the importance of audit trails in maintaining data integrity and compliance. The section differentiates between data audit trails and technical system logs, highlighting the importance of this distinction for ensuring data integrity. It also describes the three main types of data audit trail review recommended for GxP regulated systems, emphasizing the need for a detailed understanding of the process, applicable GxP requirements, and the risk to patient safety, product quality, and data integrity when making decisions about audit trails. 
Additionally, the section discusses the role of audit trails in supporting routine reviews of critical electronic records and as investigative tools, and provides guidance on when and how to review audit trail content to ensure data integrity.", "excerpt_keywords": "ISPE, GAMP, Data Integrity, Audit Trails, Electronic Records"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: appendix m4 records and data integrity\n\n9.2 regulatory background\n\nthe mhra gmp data integrity definitions and guidance for industry (2015) [1], states that:\n\n- \"where computerised systems are used to capture, process, report or store raw data electronically, system design should always provide for the retention of full audit trails to show all changes to the data while retaining previous and original data. it should be possible to associate all changes to data with the persons making those changes, and changes should be time stamped and a reason given. users should not have the ability to amend or switch off the audit trail.\n- the relevance of data retained in audit trails should be considered by the company to permit robust data review/verification. the items included in audit trail should be those of relevance to permit reconstruction of the process or activity. it is not necessary for audit trail review to include every system activity (e.g. user log on/off, keystrokes etc.), and may be achieved by review of designed and validated system reports.\n- audit trail review should be part of the routine data review/approval process, usually performed by the operational area which has generated the data (e.g. laboratory). there should be evidence available to confirm that review of the relevant audit trails have taken place. when designing a system for review of audit trails, this may be limited to those with gmp relevance (e.g. 
relating to data creation, processing, modification and deletion etc). audit trails may be reviewed as a list of relevant data, or by a validated exception reporting process. qa should also review a sample of relevant audit trails, raw data and metadata as part of self-inspection to ensure ongoing compliance with the data governance policy/procedures.\"\n\nus fda 21 cfr part 11 [2], in section 11.10 (e), requires:\n\n- \"use of secure, computer-generated, time-stamped audit trails to independently record the date and time of operator entries and actions that create, modify, or delete electronic records. record changes shall not obscure previously recorded information. such audit trail documentation shall be retained for a period at least as long as that required for the subject electronic records and shall be available for agency review and copying.\"\n\nnote: this requirement specifically covers operator entries and actions that create, modify, or delete regulated electronic records, but not all activities performed by users, and not all system actions.\n\nin the fda guidance for industry: part 11, electronic records; electronic signatures - scope and application [16], fda clarifies their expectations and interpretation:\n\n- \"we recommend that you base your decision on whether to apply audit trails, or other appropriate measures, on the need to comply with predicate rule requirements, a justified and documented risk assessment, and a determination of the potential effect on product quality and safety and record integrity. we suggest that you apply appropriate controls based on such an assessment. audit trails can be particularly appropriate when users are expected to create, modify, or delete regulated records during normal operation.\"\n\nthe guidance clarifies that when applying time stamps (such as in audit trails), they should be implemented with a clear understanding of the time zone reference used. 
in such instances, system documentation should explain time zone references as well as zone acronyms or other naming conventions.\n\nthe guidance also notes that audit trails may be just one among various physical, logical, or procedural security measures in place to ensure the trustworthiness and reliability of the records, within the context of a wider information security management framework.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "6706fe45-5896-4fae-8559-315d05704ae9": {"__data__": {"id_": "6706fe45-5896-4fae-8559-315d05704ae9", "embedding": null, "metadata": {"page_label": "79", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Ensuring Data Integrity and Compliance through Audit Trails in GMP Regulations: A Comprehensive Guide\"", "questions_this_excerpt_can_answer": "1. How does EU GMP Annex 11, revised in 2011, address the creation and documentation of audit trails for GMP-relevant data changes or deletions, and what specific requirements does it impose on electronic systems regarding the recording of the identity of individuals making these changes?\n\n2. What are the guidelines provided by the ISPE Records and Data Integrity Guide for implementing audit trails in electronic systems to ensure data integrity and compliance with GxP regulations, and how does it suggest handling the inclusion of information within these audit trails based on risk assessment?\n\n3. 
According to the ISPE Records and Data Integrity Guide, how should modifications or deletions of regulated records and data be documented in both traditional paper-based systems and electronic systems to meet GxP requirements, and what specific elements should an audit trail record to provide equivalent traceability?", "prev_section_summary": "The section discusses the regulatory background and requirements for audit trails in computerized systems for capturing, processing, reporting, or storing raw data electronically. It includes guidance from the MHRA GMP Data Integrity Definitions and Guidance for Industry, US FDA 21 CFR Part 11, and FDA Guidance for Industry: Part 11. Key topics covered include the design of systems to retain full audit trails, time-stamped changes to data, retention and accessibility of audit trail documentation, factors to consider when implementing audit trails, and the importance of time zone references in time stamps. The section emphasizes the need for audit trails to ensure data integrity and compliance with regulatory requirements.", "excerpt_keywords": "Data Integrity, Audit Trails, GMP Regulations, Electronic Systems, Compliance"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## records and data integrity\n\neu gmp annex 11, as revised in 2011, includes the following clause (with a focus on gmp relevant data changes or deletions):\n\n- \"audit trails\nconsideration should be given, based on a risk assessment, to building into the system the creation of a record of all gmp-relevant changes and deletions (a system generated \"audit trail\"). for change or deletion of gmp-relevant data the reason should be documented. 
audit trails need to be available and convertible to a generally intelligible form and regularly reviewed.\"\n\neu gmp annex 11 also requires electronic systems to record the identity of the person creating an electronic record along with the date and time. this is required even when there is no audit trail. eu gmp annex 11 requires that risk management is applied throughout the life cycle of a computerized system, which supports a review by exception approach provided that the audit trail of the application supports identification of changed records easily and that this function has been validated.\n\nsome technical and system logs may be used in support of compliance and investigations, especially in the absence of true audit trails; however, these are not intended to be audit trails in the sense that 21 cfr part 11 and eu gmp annex 11 require and considering them as such may increase compliance risk.\n\n### application and use of audit trails\n\ngxp regulations require traceability of creation, modification, or deletion of regulated records and data.\n\nin a traditional paper-based system, gxp requirements would typically be implemented as follows:\n\n- if a user recognizes that a specific data entry is wrong, they strike out the wrong data in a way that it is still readable and put the correct value next to it with their initials, the date, and in some cases the reason\n\nfor further information see the applicable requirements for good documentation practice in eu gmp chapter 4.\n\nin an electronic system, an audit trail is designed to provide equivalent traceability. the need for, and the type and extent of, audit trails should be based on a documented and justified risk assessment. specific gxp requirements requiring audit trails may also apply. an audit trail is particularly appropriate when users create, modify, or delete records and data during normal operation. the audit trail should record the:\n\n1. initial values at creation\n2. 
modifications and deletions\n3. reason(s) for such modification or deletion\n\ndecisions on which items to include in the data audit trail, if configuration is available, should be based on risk assessment and specific gxp requirements. including unnecessary information should be avoided and can increase, rather than decrease, compliance risk. with the exception of entering a reason for a change, audit trails should be automated, i.e. all audit trail functions should be executed without user intervention. audit trails should be secure from unauthorized change. an electronic data audit trail is particularly useful for higher impact records and data. other methods, e.g., change control records, may be appropriate for lower impact records and data.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "036b8345-a569-4aa8-bf7a-02180293a215": {"__data__": {"id_": "036b8345-a569-4aa8-bf7a-02180293a215", "embedding": null, "metadata": {"page_label": "80", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Best Practices for Managing Audit Trails in Electronic Recordkeeping Systems\"", "questions_this_excerpt_can_answer": "1. What specific elements should be included in audit trail information according to the ISPE GAMP\u00ae Guide to ensure compliance with best practices for managing electronic recordkeeping systems?\n\n2. How does the ISPE GAMP\u00ae Guide recommend managing audit trails when the electronic system is incapable of creating one, and what are the considerations for alternative solutions?\n\n3. 
What are the procedural and logical controls recommended by the ISPE GAMP\u00ae Guide for the effective management and verification of audit trail functionality within electronic recordkeeping systems?", "prev_section_summary": "The section discusses the importance of audit trails in ensuring data integrity and compliance with GxP regulations, specifically focusing on EU GMP Annex 11 requirements for electronic systems. It highlights the need for a risk assessment to determine the creation of audit trails for GMP-relevant data changes or deletions, as well as the recording of the identity of individuals making these changes. The section also outlines guidelines for implementing audit trails in electronic systems, including the documentation of modifications or deletions of regulated records and data. Key topics include the application and use of audit trails in traditional paper-based and electronic systems, the elements that should be recorded in an audit trail, and the importance of conducting a risk assessment to determine the extent of audit trail requirements.", "excerpt_keywords": "ISPE, GAMP, audit trails, electronic recordkeeping, data integrity"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\nispe gamp(r) guide:\n\nappendix m4 audit trail information on process operations may form part of the electronic record and should be reviewed during the approval process of the electronic record. in such cases a separate audit trail may not be required.\n\nrecords (e.g., instructions and summary reports) typically contain a history embedded in the document itself. 
a separate audit trail intended to be the equivalent of a document change history log is not normally required.\n\naudit trail information should include the following:\n\n- the identity of the person performing the action\n- in the case of a change or deletion, the detail of the change or deletion, and a record of the original entry\n- the reason for any gxp change or deletion\n- the time and date when the action was performed\n\nlogical controls should be established for the management of audit trails, including limitations to the ability to deactivate, change, or modify the function of the audit trails. procedural controls may also be needed. controls should cover:\n\n- initial verification of audit trail functionality and subsequent verification during change management\n- management, monitoring, and periodic verification of audit trail configuration according to established procedures\n- preventing configuration of audit trails by persons with normal user privileges or approval responsibility\n- preventing turning off of audit trails (except for well-documented purposes related to maintaining or upgrading the system, during which time normal user access should be prevented)\n- where an audit trail is deemed necessary, but the system is incapable of creating an audit trail, other measures (e.g., a log book) should be implemented. 
as this is neither automated nor independent of the operator, it should be regarded as a less desirable option and regulated companies should consider alternative solutions.\n- ensuring that system clocks used for time stamps are accurate and secure\n- effective segregation of duties and related role-based security (e.g., system administrator privileges should be restricted to individuals without a conflict of interest regarding the data)\n- ensuring that any change to audit trail configuration or settings is documented and justified, and captured in automatic system logs, where possible\n- periodic checks that audit trails remain enabled and effective\n- established and effective procedures for system use, administration, and change management\n\naudit trails should be regarded as only one element in a wider framework of controls, processes, and procedures aimed at an acceptable level of record and data integrity.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "72e154ef-aedb-4daf-8db1-ab7564845a2b": {"__data__": {"id_": "72e154ef-aedb-4daf-8db1-ab7564845a2b", "embedding": null, "metadata": {"page_label": "81", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Ensuring Data Integrity through Audit Trail Review: Technical Challenges and Solutions", "questions_this_excerpt_can_answer": "1. What are the three main types of audit trail review as outlined in the ISPE Records and Data Integrity Guide, and how do they differ in their application within operational and investigational contexts?\n\n2. 
According to the ISPE Records and Data Integrity Guide, what are some of the key issues that an effective audit trail review aims to identify in order to maintain data integrity, and how should the review be contextualized within business processes?\n\n3. What challenges are associated with accessing and interpreting electronic audit trails as described in the ISPE Records and Data Integrity Guide, and what recommendations are provided to enhance the usability of audit trail data for ensuring data integrity?", "prev_section_summary": "The section discusses the importance of audit trail information in electronic recordkeeping systems according to the ISPE GAMP\u00ae Guide. Key topics include the elements that should be included in audit trail information, the management of audit trails when the system is incapable of creating one, procedural and logical controls for audit trail functionality, and the wider framework of controls, processes, and procedures for record and data integrity. Entities mentioned include the identity of the person performing actions, details of changes or deletions, reasons for changes, time and date of actions, logical controls, procedural controls, system clocks for time stamps, segregation of duties, system administrator privileges, documentation of changes to audit trail configuration, and periodic checks for audit trail effectiveness.", "excerpt_keywords": "ISPE, Records, Data Integrity, Audit Trail, Electronic Audit Trails"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## records and data integrity\n\n### 9.4 audit trail review\n\nthere are three main types of audit trail review:\n\n|1.|review of data audit trails as part of normal operational data review and verification, second person verification and approval, usually performed by the operational area which has generated the data (e.g., a laboratory), i.e., using the audit trail 
routinely.|\n|---|---|\n|2.|a tool to be used for investigation (e.g., of deviations or data discrepancies) as and when required, i.e., using the audit trail as and when needed.|\n|3.|review of audit trail functionality (as part of normal periodic review or audit) to check that they remain enabled and effective, i.e., checking the audit trail.|\n\nmost audit trail reviews are of the first type, i.e., reviews conducted as part of normal operational data review. they may form part of a second person review (e.g., as required by 21 cfr part 211.194 (a)(8) [24] for laboratory systems). such reviews should not focus only on audit trails to the exclusion of other records, which may be especially important in hybrid situations. the review should cover the overall process from record generation to calculation of reportable results, which may cross system boundaries as well as the associated external records, and may cover several records and audit trails.\n\nthe objective of reviewing audit trails is to identify potential issues that may result in loss of data integrity. issues may include:\n\n- erroneous data entry\n- modifications by unauthorized persons\n- data not entered contemporaneously\n- falsification of data\n\nthe review should be performed within the context of the business process, to be effective in identifying such problems.\n\n### 9.5 technical aspects and system design\n\nproperly specified, implemented, and controlled audit trails are useful in supporting routine in-process reviews, and as investigative tools, but current electronic audit trail solutions vary in degree of effort required to access and interpret them. 
some common challenges with audit trail solutions include:\n\n- audit trails may require specialist tools to access them and are not readily available to system users\n- system logs may need to be translated from technical data to business information\n- audit trails may be very extensive and identifying specific required information is difficult\n- audit trails may contain much information that is irrelevant from the perspective of the main objective of seeking to ensure data integrity\n\nfor enhanced usability, if available, systems should be configured to allow the search, sorting, and filtering of audit trail data. it should be recognized, however, that applications may not support this.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "1e70dfad-28dd-4044-950e-3542fbbc8ac2": {"__data__": {"id_": "1e70dfad-28dd-4044-950e-3542fbbc8ac2", "embedding": null, "metadata": {"page_label": "82", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Maximizing Audit Trail Functionality for Regulated Companies: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. What considerations should regulated companies have when implementing systems with audit trails that are not fully under their control, according to the ISPE GAMP\u00ae Guide: Appendix M4?\n\n2. How does the ISPE GAMP\u00ae Guide: Appendix M4 suggest regulated companies address the challenges of supporting in-process or periodic review of audit trail information?\n\n3. 
What recommendations does the ISPE GAMP\u00ae Guide: Appendix M4 offer to regulated companies for encouraging suppliers to enhance audit trail functionality and data analysis tools?", "prev_section_summary": "The section discusses audit trail review as a key aspect of ensuring data integrity in pharmaceutical operations. It outlines three main types of audit trail review: operational data review, investigation tool, and audit trail functionality check. The review aims to identify issues such as erroneous data entry, unauthorized modifications, lack of contemporaneous data entry, and data falsification. The importance of contextualizing the review within the business process is emphasized. Technical challenges in accessing and interpreting electronic audit trails are also highlighted, including the need for specialist tools, translation of technical data to business information, and the extensive nature of audit trail data. Recommendations for enhancing usability include configuring systems for search, sorting, and filtering of audit trail data.", "excerpt_keywords": "ISPE, GAMP, audit trail, data integrity, regulated companies"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\npage 80\n\nispe gamp(r) guide: appendix m4\nregulated companies implementing and using the purchased systems should consider specific details of the available audit trail that may not be under their control. solutions may technically provide the required information, but it may be difficult and costly to support in-process or periodic review of audit trail information. 
regulated companies should encourage and support suppliers to develop useful audit trail functionality and provide effective data analysis tools.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "e4c238c9-b200-4e21-bf91-7fb83ebbcfdb": {"__data__": {"id_": "e4c238c9-b200-4e21-bf91-7fb83ebbcfdb", "embedding": null, "metadata": {"page_label": "83", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Data Integrity Auditing and Review Processes in ISPE GAMP(r) Guide: A Comprehensive Overview", "questions_this_excerpt_can_answer": "1. What specific types of audits are identified as necessary components of an effective data integrity program according to the ISPE GAMP(r) Guide, and how do they contribute to maintaining data integrity within a regulated company?\n\n2. How does the ISPE GAMP(r) Guide suggest utilizing mock inspections and data tracing exercises to identify potential weaknesses or areas of non-compliance in a company's data handling processes?\n\n3. According to the ISPE GAMP(r) Guide, what role do human reviewers play in the data integrity auditing and review process, particularly in assessing the scientific validity of changes noted in an audit trail?", "prev_section_summary": "The section discusses the considerations for regulated companies when implementing systems with audit trails that are not fully under their control, as outlined in the ISPE GAMP\u00ae Guide: Appendix M4. 
It highlights the challenges of supporting in-process or periodic review of audit trail information and offers recommendations for encouraging suppliers to enhance audit trail functionality and data analysis tools. Key entities mentioned include regulated companies, suppliers, and audit trail functionality.", "excerpt_keywords": "ISPE, GAMP, data integrity, auditing, review processes"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: page 81\n\n### records and data integrity appendix m5\n\n#### appendix m5 - data auditing and periodic review\n\n10.1 introduction\n\nsoftware applications can provide an audit trail, but only a human can decide whether an integration parameter change (noted in the audit trail) is scientifically valid; therefore, review processes remain in the human domain. review processes can be discrete or continuous, one off, or repeated and scheduled or unscheduled.\n\n10.2 auditing for data integrity\n\nauditing for data integrity goes beyond the typical internal quality auditing necessary for an effective quality system. 
types of audits required in an effective data integrity program include:\n\n- initial gap assessment or audit to determine whether a regulated company is complying with data integrity control requirements and best practices\n- ongoing internal quality audits of established data integrity controls to ensure continuing effectiveness and compliance\n- periodic audits of long term data archives to verify that the data deterioration and media migration controls are being followed and are effective\n- supplier qualification audits for suppliers creating, modifying, reviewing, analyzing, transmitting, storing, and/or archiving data on behalf of a regulated company\n- closeout gap assessment or full audit following (or close to) completion of data integrity program implementation\n\nexamples of data integrity auditing exercises could include:\n\n1. conducting a mock inspection of a specific data handling process, where the entire data life cycle would need to be explained as if it was being presented to a regulatory inspector. this can highlight any confusion about where the data resides and how it passes from one system to another, and may identify areas of weakness.\n2. picking a single result and tracing it back through to the raw data, including any laboratory notebook entries. verifying the data integrity and audit trail at each step, and demonstrating that all raw data, paper or electronic, is readily retrievable, fully supports the final result, and is consistent with any summary data filed with the regulatory agencies as part of a drug master file or new drug application.\n3. 
repeating example 2 above in the opposite direction to verify that all data has been processed and reported, and to confirm that there is no orphan data which could be indicative of trial injections or other malpractices.\n\nfurther proactive data audit activities could be based on the regulators own guidance, e.g., us fda compliance policy guide manual 7346.832 on pre-approval inspections (fda, 2010) [25] suggests that inspectors should:\n\n- review data on finished product stability, dissolution, content uniformity, and api impurity\n- determine if data was not submitted to the application that should have been", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "ddeb2730-9672-403c-b80e-d0050bdf6f97": {"__data__": {"id_": "ddeb2730-9672-403c-b80e-d0050bdf6f97", "embedding": null, "metadata": {"page_label": "84", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Ensuring Data Integrity Through Periodic Review and Monitoring in GAMP\u00ae Guide: Appendix M5\"", "questions_this_excerpt_can_answer": "1. What specific types of inconsistencies should be looked for in manufacturing documents as part of ensuring data integrity according to the ISPE GAMP\u00ae Guide: Appendix M5?\n \n2. How does the ISPE GAMP\u00ae Guide: Appendix M5 suggest monitoring human behavior and the effectiveness of technical controls during a system's periodic review?\n\n3. 
What are the recommended steps for ensuring ongoing assurance of data integrity through the review of personnel records and system administrator logs as outlined in the ISPE GAMP\u00ae Guide: Appendix M5?", "prev_section_summary": "The section discusses the importance of data auditing and periodic review in maintaining data integrity within a regulated company, as outlined in the ISPE GAMP(r) Guide. It highlights the types of audits required, such as initial gap assessments, ongoing internal quality audits, periodic audits of data archives, supplier qualification audits, and closeout gap assessments. The section also emphasizes the use of auditing exercises like mock inspections and data tracing to identify weaknesses in data handling processes. Human reviewers play a crucial role in assessing the scientific validity of changes noted in an audit trail. The section also mentions proactive data audit activities based on regulatory guidance, such as the US FDA compliance policy guide manual.", "excerpt_keywords": "ISPE, GAMP Guide, Data Integrity, Periodic Review, Monitoring"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: appendix m5 records and data integrity\n\nlook for invalidated out of specification (oos) results and assess whether it was correct to invalidate them\nseek out inconsistencies in manufacturing documents (e.g., identification of actual equipment used).\n\n## periodic review\n\nduring a system's periodic review, the following could be evaluated within the system audit trail as part of monitoring human behavior and the effectiveness of the technical controls:\n\n- any changes to system configuration that could impact data integrity controls. such changes should have been completed in accordance with the applicable change control procedure.\n- rationale for any deletion of data. 
if data was deleted as part of an archiving process, verify that the archived data is still accessible.\n- account disabling due to successive failed logons - look for repeat offenders and any timing patterns that indicate attempts at unauthorized access.\n\nsuch a review process may be practical only in a system where the audit trail can be filtered for review purposes. see appendix m4.\n\npersonnel records and system administrator logs can be reviewed for ongoing assurance of data integrity by:\n\n- checking the active user account list to ensure that only current personnel retain access to the system\n- confirming via the training records that all active personnel are adequately trained to operate the system\n- ensuring that system/database backups are happening as per the defined schedule, with the integrity of the backup being verified, and trial restoration of the system periodically occurring in a documented manner\n- ensuring that personnel are not given improper authority (e.g., enhanced access) for brief periods of time to perform improper activities\n\nother periodic review activities involve the review of:\n\n- sops\n- system records\n- sop records\n- change control\n- validation documentation\n- system performance\n\nthese activities support ongoing compliance but are out of scope of this guide. 
for further information on periodic review and sops see the ispe gamp(r) good practice guide: a risk-based approach to operation of gxp computerized systems [18].", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "3cf6b347-89be-4061-91bd-30d2df0e7767": {"__data__": {"id_": "3cf6b347-89be-4061-91bd-30d2df0e7767", "embedding": null, "metadata": {"page_label": "85", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Ensuring Data Integrity through Documented Review Processes: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. What specific section of the ISPE Records and Data Integrity Guide discusses record reviews and audit trail review?\n \n2. How does the ISPE Records and Data Integrity Guide recommend documenting the review processes within regulated companies to ensure data integrity?\n\n3. What specific statement is suggested by the ISPE Records and Data Integrity Guide to be included in the signature process for a record to certify the review of data, metadata, manually entered values, and audit trail records?", "prev_section_summary": "The section discusses the importance of ensuring data integrity through periodic review and monitoring in accordance with the ISPE GAMP\u00ae Guide: Appendix M5. 
Key topics include looking for inconsistencies in manufacturing documents, evaluating system audit trails for changes impacting data integrity controls, reviewing personnel records and system administrator logs for ongoing assurance of data integrity, and conducting various periodic review activities such as reviewing SOPs, system records, change control, and validation documentation. The section emphasizes the need for monitoring human behavior and the effectiveness of technical controls to maintain data integrity in pharmaceutical manufacturing processes.", "excerpt_keywords": "ISPE, Records, Data Integrity, Documented Review Processes, Audit Trail"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## records and data integrity\n\n### 10.4 other reviews\n\nrecord reviews and audit trail review are covered in section 4.4.\n\n### 10.5 documenting review processes\n\nwithin regulated companies, there should be documented evidence when an action was completed and by whom. data integrity audits and system periodic reviews should be documented in their respective formal reports. 
reviewing audit trail entries associated with results (i.e., data audit trail) may be governed by a \"review of gxp data sop\", and documented by a statement similar to: \"by approving this report, i certify that i have reviewed the data, metadata, manually entered values, and audit trail records associated with this data, in accordance with review sop xxx.\" this statement could be included in the signature process for a record, and be visible on the printed and displayed report.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "5969bf76-79c6-4918-9bb6-f22512fb29a5": {"__data__": {"id_": "5969bf76-79c6-4918-9bb6-f22512fb29a5", "embedding": null, "metadata": {"page_label": "86", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Empty Space: A Collection of Absences\"", "questions_this_excerpt_can_answer": "Based on the provided context, here are three questions that the context can specifically answer, which are unlikely to be found elsewhere:\n\n1. **What is the file size of the ISPE Records and Data Integrity Guide as stored in the specified directory?**\n - This question is directly answered by the provided context, which specifies the file size as 6245344 bytes. This detail is unique to this document and its storage specifics, making it unlikely to be found in other sources.\n\n2. 
**What are the creation and last modification dates of the document titled \"Empty Space: A Collection of Absences\" within the PharmaWise Engineer's PharmaWise CSV & Data Integrity project?**\n - The context provides specific dates for the creation (2024-04-07) and last modification (2024-04-04) of the document. These dates are unique identifiers of the document's version and its timeline within the project, information that would be difficult to locate elsewhere.\n\n3. **Under what file name and path is the ISPE Records and Data Integrity Guide stored within the PharmaWise Engineer's project directory?**\n - The context gives a detailed file path and name ([13] ISPE Records and Data Integrity Guide.pdf) within a specific project directory (/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/). This information is highly specific to the document's storage and organization within this particular project, making it unlikely to be replicated or found in other sources.\n\nThese questions leverage the unique, detailed information provided in the context, focusing on specifics about the document's storage, versioning, and identification within a project environment.", "prev_section_summary": "The section discusses the importance of record reviews and audit trail reviews in ensuring data integrity within regulated companies. It emphasizes the need for documented evidence of completed actions and the recommendation to document data integrity audits and system periodic reviews in formal reports. 
The section also suggests including a specific statement in the signature process for a record to certify the review of data, metadata, manually entered values, and audit trail records.", "excerpt_keywords": "ISPE, Records, Data Integrity, Guide, Absences"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\nno_content_here", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "157a14f3-d44c-42b0-894b-837ddc9472a8": {"__data__": {"id_": "157a14f3-d44c-42b0-894b-837ddc9472a8", "embedding": null, "metadata": {"page_label": "87", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Ensuring Inspection Readiness for Records and Data Integrity: Best Practices and Guidelines", "questions_this_excerpt_can_answer": "1. What specific guidance does the ISPE GAMP\u00ae Guide: Records and Data Integrity Appendix M6 provide regarding the preparation for regulatory inspections focused on records and data integrity?\n \n2. How does the ISPE GAMP\u00ae Guide suggest regulated companies should handle requests for electronic records during a regulatory inspection, including the specifics on providing printed and electronic copies?\n\n3. 
According to the ISPE GAMP\u00ae Guide: Records and Data Integrity, what considerations should regulated companies have regarding the security and preservation of electronic records provided to regulatory authorities during inspections?", "prev_section_summary": "The key topics of this section include the file details of the ISPE Records and Data Integrity Guide, such as its file size, creation and last modification dates, file name, and storage path within the PharmaWise Engineer's project directory. The section emphasizes the unique and specific information provided in the context, which pertains to the document's storage, versioning, and identification within the project environment.", "excerpt_keywords": "ISPE, GAMP, Records, Data Integrity, Inspection Readiness"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: records and data integrity\n\n### appendix m6\n\n### appendix m6 - inspection readiness\n\nthis appendix provides guidance on inspection readiness specifically for the integrity of records and data.\n\n### 11 general procedures\n\nregulated companies should:\n\n- consider record and data integrity within the context of broader inspection readiness programs\n- establish and maintain policies and procedures that ensure a constant state of inspection readiness\n- have robust established procedures for all aspects of the system life cycle\n- be prepared for regulatory inspections:\n1. on the management of record and data integrity to verify the adequacy of controls\n2. using a forensic type approach which challenge the data integrity of specific records\n\n### 11.1.1 special requests\n\ncopies of electronic records should be available on request during a regulatory inspection.\n\ncopies of electronic records may be provided as:\n\n1. 
printed copies of electronic records:\n- printed copies should be suitably marked and signed as authorized copies of an electronic record.\n- where paper copies do not represent complete copies of an electronic record, someone in the regulated area should be prepared to have a discussion with the regulatory inspector(s) to clarify this.\n2. electronic copies of electronic records:\n- the media on which it will be stored should be agreed with the regulatory inspector(s)\n- the media should be labeled as an authorized copy\n- the media should be scanned to ensure that there are no viruses and that it is an accurate and complete copy of the requested record\n\nregulated companies should also consider whether an electronic copy of an electronic record should be password controlled so the record remains secure.\n\nregulatory authorities have the same rights to access electronic records as they do for paper records.\n\nit is not necessary to keep the superseded legacy computerized systems as long as the content and meaning of a record are preserved.\n\nthe regulated company should create and retain a second copy of records and data provided to the regulatory authority, in case they need to refer to it in the future.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "8f94bd99-d6a1-40c0-958a-70581ac48e1a": {"__data__": {"id_": "8f94bd99-d6a1-40c0-958a-70581ac48e1a", "embedding": null, "metadata": {"page_label": "88", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Regulatory Inspections and Data Integrity 
in GAMP\u00ae Guide: A Comprehensive Overview", "questions_this_excerpt_can_answer": "1. What specific precautions should regulated companies take when regulatory inspectors request to take photographs of equipment and facilities, especially in areas with potentially explosive processes?\n \n2. How should regulated companies handle the situation when regulatory inspectors request direct access to computer systems to view records or workflows, in terms of data integrity and security?\n\n3. What are the recommended practices for regulated companies to prepare for regulatory inspections in terms of demonstrating control over systems, identifying responsible personnel, and establishing quality agreements, especially for global information systems and interconnected systems?", "prev_section_summary": "The section provides guidance from the ISPE GAMP\u00ae Guide on inspection readiness for records and data integrity. Key topics include general procedures for maintaining inspection readiness, handling special requests for electronic records during regulatory inspections, considerations for security and preservation of electronic records, and the importance of retaining a second copy of records provided to regulatory authorities. Entities mentioned include regulated companies, regulatory inspectors, electronic records, and legacy computerized systems.", "excerpt_keywords": "ISPE, GAMP Guide, data integrity, regulatory inspections, electronic records"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: appendix m6\n\nregulatory inspectors may want to take photographs of equipment and facilities. 
where there are restrictions on taking photographs due to safety issues in areas with potentially explosive processes, then this should be explained at the beginning of the inspection.\n\nif photographs are taken by electronic means (e.g., digital camera or smart phone), the regulated company should try to obtain copies of the photographs taken by the regulatory inspector or take their own comparable photographs. photographs should be retained, in case the regulated company needs to refer to them in the future.\n\n### records and data integrity\n\nlegal\n\nregulators may share information and may be aware of data integrity issues before an inspection occurs. details of what has been reported to other regulatory authorities may be requested during a regulatory inspection.\n\nlegal should be engaged to confirm which information may be shared within any restrictions imposed by other regulatory authorities. regulated companies should consider whether to obtain agreement that records and data are treated as confidential business information and not used for any purpose outside the jurisdiction of the regulatory authority, without prior written consent of the regulated company.\n\n### access to computer systems\n\nregulatory inspectors may request direct access to computer systems to view records or workflows. inspectors should not be granted access that allows them to manipulate data, apply electronic authorizations or approvals, or otherwise administer workflows. regulatory inspectors could be given read only access; however, it is typically more efficient to provide trained and authorized personnel to access the computerized system and for the inspector to watch.\n\nall system interactions and data queries should be performed in accordance with established sops. regulatory inspectors should understand the context of data retrieved when running data queries. 
regulated companies should keep copies of records and data provided for inspection, along with the database queries and routines used to collect them. a record of the name of the preparer and reviewer of the data collected should be maintained. consideration should be given to how to protect this information from modification.\n\n### key information for regulatory inspections\n\nregulated companies should be able to demonstrate that the systems they use are fit for their intended purpose. the regulated company should be able to readily identify personnel responsible for systems and associated data, i.e., the process owner, system owner, and data steward. personnel may be based centrally at an off-site location rather than at the point of inspection. quality agreements should be established to define roles, responsibilities, and contact information where local teams rely on central organizations or third parties. for global information systems and interconnected systems, control of records and data should be demonstrable; system interfaces should also be considered.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "40fc4333-240f-447a-91a0-6bff2045d0cd": {"__data__": {"id_": "40fc4333-240f-447a-91a0-6bff2045d0cd", "embedding": null, "metadata": {"page_label": "89", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Roles and Responsibilities of Process and System Owners in Ensuring Data Integrity\"", "questions_this_excerpt_can_answer": "1. 
What specific responsibilities do process owners and system owners have during regulatory inspections according to the ISPE Records and Data Integrity Guide?\n \n2. How does the ISPE Records and Data Integrity Guide suggest process and system owners maintain and demonstrate the integrity of system documentation, particularly in relation to validation plans, system security controls, and change control records?\n\n3. According to the ISPE Records and Data Integrity Guide, what are the key aspects of system record integrity that a process owner must be knowledgeable about and able to explain, including the operation of audit trails and the enforcement of approvals within the business process?", "prev_section_summary": "The section discusses the precautions regulated companies should take when regulatory inspectors request to take photographs of equipment and facilities, especially in areas with potentially explosive processes. It also covers how regulated companies should handle requests for direct access to computer systems by inspectors, emphasizing data integrity and security. Additionally, the section outlines recommended practices for preparing for regulatory inspections, including demonstrating control over systems, identifying responsible personnel, and establishing quality agreements, especially for global information systems and interconnected systems. Legal considerations regarding data integrity issues and sharing information with regulatory authorities are also addressed. 
Key entities mentioned include regulatory inspectors, legal departments, system owners, data stewards, process owners, and quality agreements.", "excerpt_keywords": "ISPE, Records, Data Integrity, Process Owners, System Owners"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## records and data integrity\n\n### 11.2.1 process owners and system owners\n\nthe process owner and system owner are normally accountable for responding to system specific questions during regulatory inspections.\n\nprocess owners and system owners should be:\n\n- knowledgeable about the documentation supporting the implementation, control, maintenance, use, and history of the system\n- able to discuss any technical and procedural controls implemented to support the integrity of the creation, processing, and reporting of records and data\n- able to share the information about the requirements and testing of the data integrity relating to technical and procedural controls. 
a documented version history of the system can be created to assist in sharing such information.\n- able to discuss the key computer system documents including:\n- validation plan\n- requirements:\n- data integrity controls\n- system security controls\n- validation report\n- change control records\n\n### 11.2.2 process owners\n\nthe process owner should be knowledgeable about and able to explain:\n\n- the business processes supported by the system\n- data flows\n- any business sops supporting the process\n- system security controls\n- the validation documentation supporting the validation and use of the system\n- system record integrity including how the:\n- use sop governs the timely recording of data\n- audit trail is enabled and operating\n- records are approved/signed only by authorized users\n- approvals are enforced at specific points in the business process", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "b58cf4fa-39e2-4bd9-b189-e40517a421c5": {"__data__": {"id_": "b58cf4fa-39e2-4bd9-b189-e40517a421c5", "embedding": null, "metadata": {"page_label": "90", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"System Ownership and Monitoring for Data Integrity and Compliance\"", "questions_this_excerpt_can_answer": "1. What specific practices are recommended by the ISPE GAMP\u00ae Guide: Appendix M6 for integrating audit trail review into the business process to ensure records and data integrity?\n \n2. 
According to the ISPE Records and Data Integrity Guide, what responsibilities should a system owner have in relation to supporting IT procedures and managing change controls within a system to maintain data integrity?\n\n3. What are the key areas of focus for robust monitoring of system, business, and IT support procedures as outlined in the ISPE Records and Data Integrity Guide to ensure inspection readiness and compliance with data integrity standards?", "prev_section_summary": "The section discusses the roles and responsibilities of process owners and system owners in ensuring data integrity according to the ISPE Records and Data Integrity Guide. Process owners and system owners are accountable for responding to system-specific questions during regulatory inspections. They should be knowledgeable about system documentation, technical and procedural controls, data integrity requirements, and key computer system documents. Process owners should also understand business processes, data flows, business SOPs, system security controls, validation documentation, and system record integrity, including the operation of audit trails and enforcement of approvals within the business process.", "excerpt_keywords": "ISPE, GAMP, data integrity, system owner, monitoring"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: appendix m6 records and data integrity\n\n- audit trail review (in accordance with risk) is integrated into the business process\n- records and data:\n- can be changed only by authorized users\n- are restricted from change at required points in the life cycle\n\n### 11.2.3 system owners\n\nthe system owner should be able to explain:\n\n- it procedures used to support the system\n- the change control process\n- change controls and associated documentation\n\n### 11.2.4 monitoring\n\nthere should be robust monitoring of the system, business, 
and it support procedures to ensure that the processes are adequate and are being followed. areas that should be routinely reviewed as part of monitoring to ensure inspection readiness include:\n\n- access control:\n- access sops are in place and being followed\n- available user roles are documented and managed by change control\n- documentation supporting that only authorized and trained people have system access\n- evidence that access is periodically reviewed (by automated checks where available)\n- segregation of duties enforced\n- generic accounts are not used for data modification\n- back door changes requiring it tools and skills are authorized, verified, and documented\n- historic access records\n- backup and disaster recovery:\n- documented and verified procedures for backup, restore, disaster recovery, and record retention\n- documented evidence that records and data are periodically backed up\n- records retention policies are clearly defined and followed\n- records and data can only be accessed by authorized users (network and system)\n- archived records are secure and accessible for the retention period", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "c1d5c2ba-e6e7-4313-a340-95be0d8f5f9d": {"__data__": {"id_": "c1d5c2ba-e6e7-4313-a340-95be0d8f5f9d", "embedding": null, "metadata": {"page_label": "91", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Comprehensive Guide to Ensuring Data Integrity: Personnel Training, Procedures, and Investigations\", 
"questions_this_excerpt_can_answer": "1. What specific elements should be included in the preparation of personnel for regulatory inspections according to the ISPE Records and Data Integrity Guide?\n \n2. How does the ISPE Records and Data Integrity Guide recommend handling trends across multiple data integrity incidents, and what global actions should be considered?\n\n3. According to the ISPE Records and Data Integrity Guide, how should a risk assessment be conducted during internal data integrity investigations, and what key factors should it consider?", "prev_section_summary": "The section discusses the importance of system ownership and monitoring for data integrity and compliance according to the ISPE Records and Data Integrity Guide. Key topics include integrating audit trail review into the business process, responsibilities of system owners in supporting IT procedures and managing change controls, and robust monitoring of system, business, and IT support procedures. Entities such as authorized users, access control, backup and disaster recovery procedures, segregation of duties, and record retention policies are highlighted as essential for ensuring inspection readiness and compliance with data integrity standards.", "excerpt_keywords": "ISPE, Records, Data Integrity, Personnel Training, Investigations"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## records and data integrity\n\nrecord and data maintenance\n\n### personnel preparedness, training records, and procedures\n\npersonnel using or supporting the system should be prepared for regulatory inspections, as well as the process owner and system owner. 
there should be robust processes in place to ensure that all individuals have current resumes, job descriptions, and training records.\n\nwhere there are procedures for management review of training records, there should be documented evidence supporting the review. training should ensure that personnel using or supporting computer systems understand which sops govern their roles. personnel should also be able to communicate clearly their roles and responsibilities with respect to a system.\n\nworkflows, equipment, and facilities should assure data integrity. the provision of appropriate training and supportive oversight should also assure data integrity. process and data flows can be used in risk assessments to identify where additional controls might be warranted.\n\n### internal data integrity investigations\n\nquality units may need to perform data integrity investigations including, e.g.:\n\n- deviations with incident summaries\n- root cause analysis\n- capas\n\ntrends across multiple data integrity incidents should be analyzed. global capas should be followed where there are wider organizational implications.\n\na risk assessment should form part of the investigation. 
it should consider risk to data and the consequences to safety, efficacy, and quality of medicinal products.\n\nregulatory inspectors may be interested in both human factors and any contributory supervision and leadership factors associated with the incidents subject to the inspection.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "b145562e-7be1-4108-b091-ebd9b955c305": {"__data__": {"id_": "b145562e-7be1-4108-b091-ebd9b955c305", "embedding": null, "metadata": {"page_label": "92", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Blank Canvas: A Collection of Absence\"", "questions_this_excerpt_can_answer": "Based on the provided context, here are three questions that this specific context can provide specific answers to, which are unlikely to be found elsewhere:\n\n1. **What is the file size of the \"ISPE Records and Data Integrity Guide\" as stored in the PharmaWise Engineer project on Google Drive?**\n - This question is specific to the document's digital footprint within a particular storage solution, which is detailed in the context provided.\n\n2. 
**What is the discrepancy between the creation and last modification dates of the document titled \"Blank Canvas: A Collection of Absence\" found within the ISPE Records and Data Integrity Guide?**\n - This question addresses the document management aspect, focusing on the version control and document integrity by comparing the creation and last modification dates, which are unique to this document's lifecycle as provided in the context.\n\n3. **How does the title \"Blank Canvas: A Collection of Absence\" relate to the content of the ISPE Records and Data Integrity Guide, considering the document's title seems unrelated to the typical content expected in such a guide?**\n - This question probes into the content strategy and thematic relevance, which is peculiar given the nature of the document's title juxtaposed with the expected technical content of an ISPE guide. The answer to this would require an understanding of the document's purpose and content strategy, which is uniquely provided in the given context.\n\nThese questions are tailored to the specifics of the document's metadata and thematic presentation as provided, making them unlikely to be answered by sources outside the given context.", "prev_section_summary": "The section focuses on personnel preparedness, training records, and procedures related to data integrity, as well as internal data integrity investigations. Key topics include the importance of personnel training for regulatory inspections, the need for robust processes for maintaining training records, the role of workflows, equipment, and facilities in assuring data integrity, and the steps involved in internal data integrity investigations such as deviations, root cause analysis, and CAPAs. The section also highlights the importance of analyzing trends across multiple data integrity incidents, conducting risk assessments during investigations, and considering human factors and leadership in regulatory inspections. 
Key entities mentioned include personnel, process owners, system owners, quality units, and regulatory inspectors.", "excerpt_keywords": "ISPE, Records, Data Integrity, Guide, PharmaWise Engineer"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\nno_content_here", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "7ee09460-f4ff-4f37-baf7-9abba99ac13f": {"__data__": {"id_": "7ee09460-f4ff-4f37-baf7-9abba99ac13f", "embedding": null, "metadata": {"page_label": "93", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Integrating Data Integrity into Records Management Processes: A Comparative Analysis of Life Cycle Approaches", "questions_this_excerpt_can_answer": "1. How does the ISPE GAMP guide propose integrating data integrity into existing records management processes, specifically in relation to the life cycle of records and data?\n \n2. What are the specific stages of the records management life cycle as outlined in the ISPE Records and Data Integrity Guide, and how do these stages map to the data life cycle stages mentioned in the guide?\n\n3. 
According to the ISPE Records and Data Integrity Guide, how should data owners and quality unit personnel understand the relationship between the records management group's role and the formal role of an archivist in ensuring data integrity throughout the data and records life cycle?", "prev_section_summary": "The key topics of this section include document metadata such as file size, creation and last modification dates, and document title. The entities mentioned are the \"ISPE Records and Data Integrity Guide,\" the document titled \"Blank Canvas: A Collection of Absence,\" and the PharmaWise Engineer project on Google Drive. The section focuses on specific questions related to these entities and their digital footprint within the project, highlighting the importance of document management, version control, and content relevance.", "excerpt_keywords": "ISPE, GAMP, data integrity, records management, life cycle"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: page 91\n\n## records and data integrity appendix m7\n\n### appendix m7 - integrating data integrity into existing records management processes\n\n12.1 introduction\n\nrecords management is a long-established discipline. organizations such as the association of record managers and administrators (arma) [26] are dedicated to this topic, and have developed standards in association with bodies such as the american national standards institute (ansi) [27].\n\nlarge companies typically have departments that are specifically dedicated to records management. the scope of records management is larger than gxp data; it encompasses all of the records generated by a regulated company, including financial, legal, administrative, personnel, etc. 
the principles and standards for records management are considered compatible with data integrity.\n\nrecords management has its own established vocabulary, including a defined life cycle (see figure 12.1).\n\nrecords managers should not be expected to replace or modify established standards, terminology, and processes based on convention or additional guidance relating to a relatively small subset of the records they control, i.e., gxp regulated records.\n\nthe records life cycle does, however, map to the data life cycle discussed in this guide (see section 4) without risk to data integrity. data owners and quality unit personnel should understand this mapping and how the regulated company's records processes meet expectations for the corresponding phases. for example, they should understand whether the records management group fulfills the formal role of archivist. record managers typically apply a record life cycle that translates to paper, as well.\n\n|data life cycle|record management life cycle|\n|---|---|\n|creation|creation|\n|processing|classification occurs here|\n|review, reporting, and use|may be referred to as part of normal business process|\n|retention and retrieval|active records|\n|destruction|semi-active records|\n| |inactive records|\n| |destruction|\n| |not regularly referenced; but may be retrieved (e.g., for audits)|\n| |retained for remainder of retention period or legal hold|\n| |retention period is complete and no legal hold exists|\n\nfigure 12.1: mapping of record life cycle to data life cycle", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "5588a1bb-7d78-4e7d-8cfc-320ffbc00b88": {"__data__": {"id_": "5588a1bb-7d78-4e7d-8cfc-320ffbc00b88", "embedding": null, "metadata": {"page_label": "94", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": 
"/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Comprehensive Title: \"Managing the Lifecycle of Records in Global Information Systems\"", "questions_this_excerpt_can_answer": "1. How does the ISPE GAMP\u00ae Guide: Appendix M7 suggest classifying records for effective management, and what criteria are recommended for determining the applicable laws and regulations?\n \n2. What are the specific considerations and processes recommended by the ISPE GAMP\u00ae Guide: Appendix M7 for managing the lifecycle of semi-active records in global information systems, including their storage and retrieval?\n\n3. According to the ISPE GAMP\u00ae Guide: Appendix M7, what factors should be considered in system retirement planning to ensure the availability of systems originally used for data analysis, especially in the context of reprocessing records during their active phase?", "prev_section_summary": "The section discusses the integration of data integrity into existing records management processes, specifically focusing on the life cycle of records and data. It highlights the importance of understanding the relationship between records management and data integrity, emphasizing that records management encompasses all records generated by a regulated company, not just GXP data. The section outlines the stages of the records management life cycle and how they map to the data life cycle stages. It also mentions the role of data owners, quality unit personnel, and archivists in ensuring data integrity throughout the records and data life cycle. 
The section provides a comparison between the data life cycle and the record management life cycle, showing how they align without compromising data integrity.", "excerpt_keywords": "ISPE, GAMP, records management, data integrity, global information systems"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: appendix m7 records and data integrity\n\n|12.2 record creation|records should be classified (e.g., as gxp, data privacy sensitivity) in order to understand which laws and regulations apply to their management.|\n|---|---|\n|12.3 active records|active records are routinely subject to retrieval for business purposes. electronically, they will reside in the active database. paper records should be readily retrievable in a short time. for global information systems, this might involve replication to local sources in distributed systems. records may need to be reprocessed. this typically would occur during the active phase, although it could happen in the semi-active, or possibly in the inactive phase (possibly in response to an audit request). change control processes should be followed if the record is to be updated. this may affect the retention period. typically, reprocessing would require the availability of the system originally used to analyze the data. the likelihood of such a need should be a consideration for system retirement planning.|\n|12.4 semi-active records|these records can continue to be referenced for business purposes, although rarely. for example, they may be needed to support a regulatory inspection. electronically, they may reside in a near-line archive with limited access. expectations for the retrieval of semi-active records needs to be clearly defined. 
for global information systems, near-line archives can be local, although it is probably better to do this globally in order to minimize the number of copies of the record being managed. regulated companies may prefer to limit themselves to active and inactive, rather than use a semi-active stage in their life cycle.|\n|12.5 inactive records|most archived records fall into this category. these records are unlikely to be retrieved, but are being held to conform to retention policy. for global information systems, it is recommended that one archive should be managed globally, which will make the eventual destruction of a record simpler.|\n|12.5.1 destruction|this stage of the record life cycle is effectively the same as for the data life cycle.|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "296a20bf-8a0f-4e1e-a05b-06636a44baa2": {"__data__": {"id_": "296a20bf-8a0f-4e1e-a05b-06636a44baa2", "embedding": null, "metadata": {"page_label": "95", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Data Integrity Requirements for GXP Regulated Computerized Systems: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. How does the ISPE GAMP\u00ae Guide suggest integrating business process understanding and regulatory assessment into the system validation process for GXP regulated computerized systems?\n \n2. 
What specific guidance does Appendix D1 of the ISPE Records and Data Integrity Guide provide regarding the establishment and definition of data integrity requirements for new and existing GXP regulated computerized systems?\n\n3. How does the example of a change management enterprise system business process workflow in the guide illustrate the process of deriving user requirements from business processes to ensure data integrity and regulatory compliance?", "prev_section_summary": "The section discusses the management of records in global information systems according to the ISPE GAMP\u00ae Guide: Appendix M7. Key topics include the classification of records for effective management, the lifecycle of active, semi-active, and inactive records, system retirement planning, and the destruction of records. Entities mentioned include active records, semi-active records, inactive records, change control processes, near-line archives, and retention policies. The section emphasizes the importance of understanding applicable laws and regulations, ensuring the availability of systems for data analysis, and defining expectations for the retrieval of records.", "excerpt_keywords": "ISPE, GAMP, data integrity, computerized systems, regulatory compliance"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: page 93\n\n### records and data integrity appendix d1\n\n#### appendix d1 - user requirements\n\n13.1 introduction\n\nthis appendix provides specific guidance on establishing and defining data integrity requirements for new and existing gxp regulated computerized systems. this appendix supports ispe gamp(r) 5 [3], which provides general guidance on the contents and production of a urs.\n\nurss should describe the required functions of the computerized system and be based on a documented risk assessment. 
urss should be linked to the defined business process workflows and regulatory requirements governing the data and records in the system.\n\ngxp requirements and data should be identified and documented to support appropriate quality risk management throughout the system life cycle. requirements should be unambiguous and testable.\n\n13.2 business process\n\nuser requirements should accurately reflect business process and data workflows, in order to establish a computerized system which meets its intended use. business process understanding and regulatory assessment should drive the system validation from the initial user requirements to its functional and design requirements through qualification, procedural controls, system release and continued use. a consistent and complete urs should be produced to help ensure successful validation and compliance.\n\nbusiness processes and associated data should be documented, e.g., through definition of business processes and/or data workflows, to ensure that a system adequately addresses all data integrity concerns necessary to meet regulatory requirements and expectations. for further information on defining business process flows and data flow diagrams, see appendix d2.\n\nfigure 13.1 provides an example of a change management enterprise system business process workflow and shows how potential user requirements may be derived from it.\n\nlaying out the business process workflow can assist the stakeholders in identifying and agreeing the roles, records, signature requirements, system functionality, etc., necessary to support the system for its intended use.\n\npotential failures can also be assessed and remediated prior to selecting, designing, or establishing the system; therefore, saving resources, time, and money. 
in this example the business process workflow is broken into user requirements that further clarify how the system is intended to be used.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "7f27d2df-911a-4d29-bce2-95cb0f909bc1": {"__data__": {"id_": "7f27d2df-911a-4d29-bce2-95cb0f909bc1", "embedding": null, "metadata": {"page_label": "96", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Guidelines for Change Management and Data Integrity Requirements in ISPE GAMP\u00ae Guide\"", "questions_this_excerpt_can_answer": "1. What are the specific stages outlined in the ISPE GAMP\u00ae Guide for managing system changes within the context of records and data integrity, and how does each stage contribute to ensuring the integrity of the change management process?\n\n2. How does the ISPE GAMP\u00ae Guide recommend handling the documentation and approval process for system changes to ensure compliance with data integrity standards, particularly in terms of roles and responsibilities of change initiators and quality assurance personnel?\n\n3. 
What general data integrity related requirements are mentioned in the ISPE GAMP\u00ae Guide that should be addressed by standard computerized system validation, operation, and compliance processes, and how are these requirements suggested to be integrated into a User Requirements Specification (URS) without directly copying the text verbatim?", "prev_section_summary": "The section discusses the establishment and definition of data integrity requirements for GXP regulated computerized systems, with a focus on user requirements and business processes. Key topics include the importance of accurately reflecting business processes in user requirements, linking requirements to regulatory requirements, and the use of business process workflows to derive user requirements for system design. The section emphasizes the need for unambiguous and testable requirements to support quality risk management throughout the system life cycle. Additionally, the section provides guidance on documenting business processes and data workflows to ensure data integrity and regulatory compliance.", "excerpt_keywords": "ISPE, GAMP, data integrity, change management, system validation"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: appendix d1\n\nfigure 13.1: potential user requirements derived from a business process workflow\nrecords and data integrity\n\nchange management enterprise system\n\n|change planning|change assessment|change pre-approval|change implementation|change post-approval|\n|---|---|---|---|---|\n|initiate a system change| |no quality approval?|implement approved change|assure change implemented and documented per approved change plan|\n|document the change plan|identify smes to assess change|yes quality approval|the change initiator only shall be able to add printed evidence to the change, providing evidence that the change has been implemented per 
the change plan| |\n|sme 1 assessment|quality approval|no quality approval?| |quality approval change closed|\n|sme 2 assessment|quality approval|yes quality approval| | |\n\nchange management access system\n\n- system shall be granted to only authorized and trained users\n- change initiator shall have rights to document proposed change but not approve it\n- the change management system shall allow the change initiator to document the planned change.\n- the change management system shall send automatic notification to identified smes that a change is ready for assessment;\n- the change management system shall allow only the quality role to approve or reject a change.\n\ngeneral data integrity requirements\n\ngeneral data integrity related requirements and expectations may be addressed by processes or procedures that are part of standard computerized system validation, operation, and compliance (e.g., backups, ensuring individuals understand the accountability for their electronic signatures) or business processes and may not be included in a urs.\n\nnote: the text in the following tables should not be copied and included verbatim into a urs. content should be converted into specific and testable requirements, taking into account the context and use of the system. 
regulations and business rules governing a process should be considered, and relevant requirements should be documented in the urs.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "64f02ceb-b807-45de-b299-1f01f7caefba": {"__data__": {"id_": "64f02ceb-b807-45de-b299-1f01f7caefba", "embedding": null, "metadata": {"page_label": "97", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Data Integrity and Security Controls in Computerized Systems: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. What specific security measures does the ISPE GAMP\u00ae guide recommend for restricting physical access to computerized systems in pharmaceutical environments, and how should changes in access authorizations be documented?\n\n2. According to the ISPE Records and Data Integrity Guide, how should a computerized system ensure the integrity of records throughout their lifecycle, and what are the requirements for generating copies of GxP electronic records for agency inspection?\n\n3. What are the detailed requirements outlined by the ISPE guide for maintaining data integrity during the data retention period, including the process for regular backups, data retention enforcement mechanisms, and the validation of backup data integrity?", "prev_section_summary": "The section discusses the guidelines for change management and data integrity requirements in the ISPE GAMP\u00ae Guide. 
It outlines specific stages for managing system changes, including change planning, assessment, pre-approval, implementation, and post-approval. The roles and responsibilities of change initiators and quality assurance personnel are highlighted, emphasizing the need for documentation and approval processes to ensure compliance with data integrity standards. The section also addresses general data integrity requirements that should be integrated into standard computerized system validation, operation, and compliance processes, without directly copying text into a User Requirements Specification (URS). Key topics include change management processes, access system requirements, and general data integrity expectations.", "excerpt_keywords": "ISPE, GAMP, data integrity, computerized systems, security controls"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: page 95\n\n### records and data integrity appendix d1\n\n|technical requirements|\n|---|\n|#|requirement|\n|1|the system should employ logical controls to restrict access to authorized persons. the extent of security controls depends on the criticality of the computerized system. the system should use authority checks to ensure that only authorized individuals can use the system, electronically sign a record, access the operation or computer system input or output device, alter a record, or perform the operation at hand [2]. the system should have access controls to ensure that personnel have access only to functionality that is appropriate for their job role, and that actions are attributable to a specific individual.|\n|2|suitable control methods for preventing unauthorized physical access to the system should be employed e.g., computer hardware, communications equipment, peripheral components and electronic storage media. 
controls may include the use of keys, pass cards, personal codes with passwords, biometrics, or restricted access to specific computer equipment (e.g., data storage areas, interfaces, computers, server rooms). creation, change, and cancellation of access authorizations should be recorded.|\n|3|the system should ensure that the accuracy, completeness, content, and meaning of data is retained throughout the data life cycle. original records and true copies should preserve the integrity (accuracy, completeness, content, and meaning) of the record.|\n|4|the system should be able to generate accurate and complete copies of gxp electronic records in both human readable and electronic form suitable for inspection, review, and copying by the agency [2].|\n|5|access to the system should be via individual login credentials made up of a unique combination of user id and password. pass through technologies such as single sign on that leverage earlier user authentication are acceptable.|\n|6|the system should provide a mechanism to archive complete and accurate records, including relevant metadata, from the system. the records should continue to be protected from deliberate or inadvertent loss, damage and/or alteration for the retention period. security controls should be in place to ensure the data integrity of the record throughout the retention period, and validated where appropriate.|\n|7|the computer system should provide a process for regular backups of all data including relevant metadata. note: the urs should include details as to the frequency of backup, the nature of the backup (full/incremental), and the length of time the backups are retained. the integrity and accuracy of backup data, and the ability to restore the data, should be checked during validation and monitored periodically [7].|\n|8|the system should provide a mechanism to enforce data retention requirements, including data ownership, data holds (regulatory holds), and destruction of data. 
stored data should be verified for restorability, accessibility, readability and accuracy throughout the retention period [7].|\n|9|where appropriate, operational system checks should enforce permitted sequencing of gxp steps and events, and should disallow non-permitted sequencing of gxp steps and events [2].|\n|10|computerized systems exchanging data electronically with other systems should include appropriate built-in checks for the correct and secure entry and processing of data, in order to minimize the risks [7].|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "79cb2315-52bd-4428-a60d-b41ea629d01d": {"__data__": {"id_": "79cb2315-52bd-4428-a60d-b41ea629d01d", "embedding": null, "metadata": {"page_label": "98", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Data Integrity and Electronic Record Management in ISPE GAMP(r) Guide: A Comprehensive Overview", "questions_this_excerpt_can_answer": "1. What specific features should a computer system have to ensure the integrity and security of manually entered data according to the ISPE GAMP(r) Guide's appendix D1?\n \n2. How does the ISPE GAMP(r) Guide outline the requirements for electronic signatures to prevent their misuse or falsification in electronic record management systems?\n\n3. 
What procedural requirements are outlined in the ISPE GAMP(r) Guide for personnel involved in the development, maintenance, or use of electronic record/electronic signature systems to ensure compliance with GxP regulations?", "prev_section_summary": "The section discusses the technical requirements outlined by the ISPE GAMP\u00ae guide for ensuring data integrity and security controls in computerized systems within pharmaceutical environments. Key topics include restricting physical and logical access to authorized personnel, generating accurate and complete copies of electronic records for agency inspection, maintaining data integrity throughout the data lifecycle, implementing security controls for data retention, conducting regular backups, enforcing data retention requirements, and ensuring the accuracy and security of data exchange between computerized systems. Key entities mentioned include access controls, authorization checks, user login credentials, metadata, data retention mechanisms, backup processes, data ownership, regulatory holds, data destruction, and data exchange protocols.", "excerpt_keywords": "ISPE, GAMP, data integrity, electronic signatures, procedural requirements"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: appendix d1 records and data integrity\n\n|requirement|\n|---|\n|11|the system should perform an accuracy check on manually entered data.|\n|12|the system should provide a secure, computer generated, time stamped audit trail to independently record the date and time of entries and actions that create, modify, or delete electronic records. record changes shall not obscure previously recorded information [2]. the system should record the identity of operators entering or confirming critical data. 
any modification to an entry of critical data should be recorded with the reason for the change.|\n|13|the system should provide audit trails that are available and convertible to a human readable form [7].|\n|14|the system should enable review of audit trails that capture changes to critical data, e.g., as part of the review of their associated records.|\n|15|the computer system should ensure that electronic signatures, including the human readable display or format, captured by the system include [2]: 1. printed name of the signer 2. date and time when signature executed 3. meaning associated with the signature|\n|16|electronic signatures and handwritten signatures executed to electronic records should be linked to their respective electronic records to ensure that the signatures cannot be excised, copied, or otherwise transferred to falsify an electronic record by ordinary means [2].|\n|17|the system should use at least two distinct identification components such as an identification code and password to ensure that electronic signatures can only be used by their genuine owners [2]. 
the system should support that the human readable form of an electronic signature for display or print out should be unique to an individual.|\n|18|the computer system should ensure that when an individual executes a series of signings during a single, continuous period of controlled system access, the first signing shall be executed using all electronic signature components; subsequent signings shall be executed using at least one electronic signature component that is only executable by, and designed to be used only by, the individual [2].|\n|19|the computer system should ensure that when an individual executes one or more signings not performed during a single, continuous period of controlled access, each signing shall be executed using all of the electronic signature components [2].|\n\n## procedural requirements\n\n|requirement|\n|---|\n|1|all personnel should have appropriate qualifications, level of access, and defined responsibilities to carry out their assigned duties [7].|\n|2|evidence should be available to demonstrate that persons who develop, maintain, or use electronic record/electronic signature systems have the education, training, and experience to perform their assigned tasks [2].|\n|3|gxp electronic records created, processed, stored, or reported should be identified. 
the system should be able to generate accurate and complete copies of gxp electronic records in both human readable and electronic form suitable for inspection, review, and copying [2].|\n|4|procedures should be established to check stored data for accessibility, durability, and accuracy.|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "b2315260-a877-4be4-82c8-9f63a64d87c5": {"__data__": {"id_": "b2315260-a877-4be4-82c8-9f63a64d87c5", "embedding": null, "metadata": {"page_label": "99", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Data Integrity and Electronic Signature Procedures Document", "questions_this_excerpt_can_answer": "1. What specific measures does the ISPE Records and Data Integrity Guide recommend for managing computerized system configuration settings to ensure their integrity and security from unauthorized access?\n \n2. How does the ISPE Records and Data Integrity Guide propose to maintain the accountability and responsibility of individuals for actions initiated under their electronic signatures, and what procedures should be established to ensure the security and uniqueness of these electronic signatures?\n\n3. 
According to the ISPE Records and Data Integrity Guide, what are the recommended practices for the review and approval of critical changes with data integrity implications, such as system access changes and data deletion, especially those performed under system administrator access?", "prev_section_summary": "The section discusses the requirements outlined in the ISPE GAMP(r) Guide for ensuring data integrity and electronic record management. Key topics include accuracy checks on manually entered data, secure audit trails with time stamps and operator identities, availability of audit trails in human-readable form, review of audit trails capturing changes to critical data, requirements for electronic signatures to prevent misuse or falsification, and procedural requirements for personnel involved in electronic record/electronic signature systems. Entities mentioned include the system, operators, electronic signatures, identification components, and personnel with appropriate qualifications and responsibilities.", "excerpt_keywords": "ISPE, Records, Data Integrity, Electronic Signatures, Computerized System Configuration"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## records and data integrity\n\n### appendix d1\n\n|requirement #|requirement|\n|---|---|\n|5|validation documentation and reports should cover the relevant steps of the life cycle and should include operational change control records (if applicable) and reports on any deviations observed during the validation process [7]. regulated companies should be able to justify their standards, protocols, acceptance criteria, procedures, and records based on their risk assessment [7].|\n|6|computerized system configuration settings should be defined, tested as part of computer system validation, and protected from unauthorized access. 
they should be managed under change control.|\n|7|procedures should be established for an additional check on the accuracy of the record when critical data are being entered manually [7].|\n|8|procedures should be established to ensure that only authorized personnel can amend entered data.|\n|9|audit trail information should be retained for a period at least as long as that required for the subject electronic records and should be available for regulatory review and copying [2].|\n|10|based upon risk, procedures should be established to review audit trails with each critical record, and before final approval of the record.|\n|11|system access records should be periodically reviewed based upon the criticality of the process supported by the computerized system.|\n|12|system administrator access should be restricted to the minimum number of personnel possible, taking account of the size and nature of the regulated company. personnel with system administrator access should log in under unique logins that allow actions in the audit trail(s) to be attributed to a specific individual. the generic system administrator account should not be available for use [8].|\n|13|critical changes with data integrity implications (e.g., system access changes, configuration changes, data movement, data deletion etc.) performed under system administrator access should be visible to, and approved within, the quality system.|\n|14|business areas should ensure individuals understand that they are accountable and responsible for actions initiated under their electronic signatures [2], and that electronic signature components should not be made known to others.|\n|15|procedures should be established to ensure that electronic signatures have the same impact as handwritten signatures. 
the consequences of misuse or falsification should be documented.|\n|16|a procedure should be established to ensure that the identity of the individual is verified prior to the assignment of their electronic signature [2], or any element of an electronic signature (such as the user id).|\n|17|procedures should be established to ensure that electronic signatures are unique to one individual and not reused or reassigned [2].|\n|18|procedures should be established to maintain the uniqueness of each combined identification code and password, such that no two individuals have the same combination of identification code and password [2].|\n|19|processes should be established to ensure that the attempted use of an individual's electronic signature by anyone other than its genuine owner requires collaboration of two or more individuals [2].|\n|20|a procedure should be in place to ensure that the linkage of handwritten signatures to electronic records is maintained throughout the retention period.|\n|21|procedures should be established to perform periodic testing of devices that bear or generate the confidential component of an electronic signature to ensure that they function properly and have not been altered in an unauthorized manner [2].|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "152f6ac3-d28f-4eab-b030-9d05f81021ab": {"__data__": {"id_": "152f6ac3-d28f-4eab-b030-9d05f81021ab", "embedding": null, "metadata": {"page_label": "100", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Data Security 
and Integrity Protocols for Password Management and Signature Delegation", "questions_this_excerpt_can_answer": "1. What specific procedures does the ISPE GAMP\u00ae Guide Appendix D1 recommend for managing the expiration of passwords within the context of records and data integrity?\n \n2. How does the ISPE GAMP\u00ae Guide Appendix D1 suggest handling the delegation of electronic signature responsibilities during periods such as absences or holidays to maintain data integrity?\n\n3. According to the ISPE GAMP\u00ae Guide Appendix D1, what measures should be taken to address the issue of lost, stolen, or compromised devices that are used for generating identification codes or passwords, to ensure the continued integrity of records and data?", "prev_section_summary": "The section focuses on the requirements and best practices for maintaining records and data integrity in regulated companies, as outlined in the ISPE Records and Data Integrity Guide. Key topics include validation documentation, computerized system configuration settings, audit trail retention, system access controls, electronic signatures, and procedures for ensuring data integrity. Entities mentioned include regulated companies, personnel with system administrator access, individuals using electronic signatures, and devices generating electronic signatures. 
The section emphasizes the importance of accountability, security, and uniqueness in managing electronic records and signatures to ensure data integrity.", "excerpt_keywords": "ISPE, GAMP, data integrity, password management, signature delegation"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: appendix d1\n\n|requirement|records and data integrity|\n|---|---|\n|22|password aging procedures should ensure that identification code and password issuances are periodically checked, recalled, or revised [2].|\n|23|password expiry procedures should be established.|\n|24|procedures should ensure that the ability to apply electronic signatures is withdrawn for individuals whose responsibilities change, without the loss of information relating to signatures already executed.|\n|25|loss management procedures should be established to electronically deactivate lost, stolen, missing, or otherwise potentially compromised tokens, cards, and other devices that bear or generate identification code or password information, and to issue temporary or permanent replacements using suitable, rigorous controls [2].|\n|26|procedures should cover the method of delegating signature responsibilities (e.g., periods of absence, holidays).|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "40b4edd4-40c6-4a24-b378-63d0e8220a5a": {"__data__": {"id_": "40b4edd4-40c6-4a24-b378-63d0e8220a5a", "embedding": null, "metadata": {"page_label": "101", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 
6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Process Mapping and Interfaces in ISPE GAMP(r) Guide: A Comprehensive Overview", "questions_this_excerpt_can_answer": "1. What are the two primary visual tools mentioned in the ISPE GAMP(r) Guide for analyzing relationships within a business activity, specifically in the context of process mapping and interfaces?\n \n2. How do the business process flowcharts and data flow diagrams differ in terms of what they identify within the process mapping and interfaces as outlined in the ISPE GAMP(r) Guide?\n\n3. Can you describe the purpose and content of a hypothetical level 1 flowchart example provided in the ISPE GAMP(r) Guide, particularly in relation to supporting IT infrastructure processes?", "prev_section_summary": "The section discusses the ISPE GAMP\u00ae Guide Appendix D1 recommendations for managing password expiration, handling electronic signature delegation, and addressing lost or compromised devices to ensure data security and integrity. Key topics include password aging procedures, password expiry procedures, withdrawal of electronic signature abilities, loss management procedures for compromised devices, and delegation of signature responsibilities during absences or holidays. 
Key entities mentioned include identification codes, passwords, electronic signatures, tokens, cards, and devices used for generating password information.", "excerpt_keywords": "ISPE, GAMP, process mapping, data integrity, business activities"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: page 99\n\n### records and data integrity appendix d2\n\n#### appendix d2 - process mapping and interfaces\n\n14.1 introduction\n\nprocess workflow and dataflow diagrams are visual tools to show relationships of a business activity, including the creation and/or movement of data through a business activity and/or relationships between entities (interfaces). visual tools permit whole systems/processes to be analyzed in ways that would otherwise be difficult to achieve with text alone. it can be difficult to understand a process adequately without process workflows and relationship diagrams, including data flow diagrams across infrastructure, especially when enterprise level systems are involved. two commonplace tools used are:\n\n1. business process flowcharts, which identify:\n- business activities and decision points\n2. data flow diagrams, which identify:\n- the creation, movement, use, and archiving of data throughout a process\n\nboth the business process flowchart and dataflow diagram may be implemented in layers, with level 1 giving the most abstract \"high level\" view and levels 2, 3 etc., giving progressively more details about the process or data under consideration.\n\n14.2 process flowcharts\n\nprocess flowcharts illustrate the discrete steps of a business process; the \"process\" view of activities. this includes actions, decision points, and subprocesses. figure 14.1 provides an example of a hypothetical level 1 flowchart of a supporting it infrastructure process, granting a user access to a business computer system. 
this flowchart provides the steps and relationships between steps that people in the business would perform. this example was chosen as a simple illustration only, and is not intended to suggest that the process shown is a gxp regulated business process.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "682dd507-d1be-4d50-927b-96487a76f149": {"__data__": {"id_": "682dd507-d1be-4d50-927b-96487a76f149", "embedding": null, "metadata": {"page_label": "102", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Improving Process Understanding and Risk Management in IT Infrastructure Access Granting Process through Flowchart/Table Combinations", "questions_this_excerpt_can_answer": "1. How does the ISPE GAMP\u00ae Guide suggest incorporating additional dimensions, such as user roles or locations, into a basic IT infrastructure process flowchart for granting access to a computer system?\n\n2. What method does the ISPE GAMP\u00ae Guide recommend for providing detailed explanations of each step in an IT infrastructure process flowchart, specifically in the context of granting access to a computer system?\n\n3. 
How can flowchart/table combinations, as outlined in the ISPE GAMP\u00ae Guide, facilitate the application of risk management in the process of granting access to IT infrastructure, and what specific metadata can be associated with points in the process to support a risk-based control strategy?", "prev_section_summary": "The section discusses the importance of process mapping and interfaces in the context of the ISPE GAMP(r) Guide. It highlights the use of visual tools such as business process flowcharts and data flow diagrams to analyze relationships within a business activity, specifically focusing on the creation and movement of data and relationships between entities. The section explains the differences between business process flowcharts and data flow diagrams, emphasizing that they can be implemented in layers to provide varying levels of detail. A hypothetical level 1 flowchart example of a supporting IT infrastructure process is provided to illustrate the concept. The section emphasizes the significance of process workflows and relationship diagrams in understanding complex processes, especially in enterprise-level systems.", "excerpt_keywords": "ISPE, GAMP Guide, IT infrastructure, process mapping, risk management"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: appendix d2\n\n### figure 14.1: example it infrastructure process: granting access to a computer system\n\n[flowchart: steps from \"access change is needed\" through obtaining and completing the access request form and submitting the approval action, to \"action completed in system\" and \"end access change\"]\n\nnote: in this first flowchart user roles, locations, or details are not specified. 
this flowchart can be extended by using lanes (also called swim lanes) that add an additional dimension to the flowchart, such as the role that should perform the action, or the location.\n\nanother approach, shown in figure 14.2, is to create a table that provides details to explain each step of the flowchart in greater detail. for example, each action is numbered, and corresponding numbers in a table can provide details such as the:\n\n- location (where)\n- responsible person/role (who)\n- proper time to perform the action (when)\n- output(s) of the action\n\nflowchart/table combinations can help in understanding a process, and allow users to associate useful metadata with specific points. risk management can also be applied at these points to form the basis of a risk-based control strategy.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "2f10a780-6d68-480c-95e8-4b48001aa801": {"__data__": {"id_": "2f10a780-6d68-480c-95e8-4b48001aa801", "embedding": null, "metadata": {"page_label": "103", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Access Change Request and Approval Process Flowchart", "questions_this_excerpt_can_answer": "1. How does the Access Change Request and Approval Process specifically incorporate the concept of records and data integrity within its procedural flowchart, as outlined in the ISPE Records and Data Integrity Guide?\n\n2. 
What is the rationale behind structuring the Access Change Request and Approval Process flowchart into multiple levels, and how does each level cater to different audiences within an organization, according to the document?\n\n3. Can you detail the specific roles and responsibilities assigned at each step of the Access Change Request and Approval Process, as defined in the ISPE Records and Data Integrity Guide, and how these roles contribute to maintaining data integrity?", "prev_section_summary": "The section discusses the use of flowchart/table combinations in the context of improving process understanding and risk management in the IT infrastructure access granting process. Key topics include incorporating additional dimensions such as user roles or locations into a basic IT infrastructure process flowchart, providing detailed explanations of each step in the flowchart, and facilitating the application of risk management through the association of metadata with specific points in the process. 
The section also mentions the use of lanes (swim lanes) to add dimensions to the flowchart and the creation of a table to provide detailed information for each step of the process.", "excerpt_keywords": "ISPE, Records, Data Integrity, Access Change Request, Approval Process"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## records and data integrity\n\nfigure 14.2: example process flowchart (with additional details, linked by step numbers)\n\n[flowchart with linked table: steps \"access change is needed\", \"complete access request form\", \"obtain approval\", \"submit action\", \"completed in system\" (notifications sent to the person and system owner by e-mail message), and \"end access change\"; the numbered steps are annotated with who (line supervisor, system owner, access training leader), when (upon request, user completes form, once user approves, after owner approves), and where (access website)]\n\nfor large and/or complex systems it may be more efficient to flowchart the process in several levels. for example:\n\n- a level 1 flowchart shows the entire system as a single box, showing its interfaces to other large systems.\n- a level 2 flowchart shows interfaces between the system and one or several other systems, in greater detail, e.g., the interface with the financial system and its associated product library with standard costs.\n\nprocess flowcharts detailed in this multiple-level format can be used to provide information at the level needed by a specific audience, e.g.:\n\n- level 1: intended for senior executives\n- level 2: intended for system support personnel\n\nlevels can be added as needed, e.g., if level 2 does not have sufficient detail for system understanding, another layer of depth may be necessary (e.g., level 3). the goal should be to create the minimum number of levels needed to define requirements, identify decision points and risks, and support the system in operation.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "a2f918da-bd69-40f8-8cda-3dedefe1ec37": {"__data__": {"id_": "a2f918da-bd69-40f8-8cda-3dedefe1ec37", "embedding": null, "metadata": {"page_label": "104", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Data Flow Diagrams for Process and Data Understanding in Business Processes", "questions_this_excerpt_can_answer": "1. How does the ISPE GAMP\u00ae Guide suggest incorporating data integrity considerations into process flowcharts, specifically in the context of creating and managing training plans?\n \n2. What methodology does the ISPE Records and Data Integrity Guide recommend for visually representing the movement and impact of data elements throughout a business process, particularly in systems of moderate complexity?\n\n3. According to the document, what are the key benefits of creating data flow diagrams in parallel with business process flowcharts, and how does this approach facilitate the understanding of both process and data views for support personnel?", "prev_section_summary": "The section discusses the Access Change Request and Approval Process flowchart in relation to records and data integrity, as outlined in the ISPE Records and Data Integrity Guide. 
It explains the process flowchart structure with multiple levels catering to different audiences within an organization. Specific roles and responsibilities at each step of the process are detailed, emphasizing their contribution to maintaining data integrity. The section also mentions the efficiency of using multiple levels in flowcharting complex systems and the importance of providing information at the appropriate level for different stakeholders.", "excerpt_keywords": "ISPE, GAMP, data integrity, process flowcharts, data flow diagrams"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: appendix d2 records and data integrity\n\nfigure 14.3: example process flowchart (taken to a greater level of detail)\n\n[flowchart: change access is needed -> complete access request form -> submitted for review -> review access request -> system owner approval -> new user? (yes: obtain approval and add course to training plan, creating the training plan; no: no action needed) -> execution approval -> approved form -> lookup people to notify -> send notification message -> notifications to person and system owner archived -> end access change]\n\ndata flow diagrams\n\ndata flow diagrams should graphically illustrate the creation, use, and movement of data elements throughout a business process: the \"data\" view of activities. data flow diagrams should display data elements (fields, tables, or databases) that are impacted at each step. data flow diagrams should also use images similar to those in process flowcharts to illustrate actions and decisions.\n\nfor simple systems, a hybrid may be created that combines both process and data in a single flow diagram. for systems of moderate complexity, a business process flowchart may be created, along with an identical data flow diagram re-labelled with data rather than process activities. 
this approach can allow support personnel to see both process and data views in parallel, to help process understanding.\n\ndata flow diagrams can be useful for identifying:\n\n- data impacted by activities\n- data elements required by regulations\n- data that can be reprocessed or modified (therefore requiring an audit trail)\n- data that is necessary for correct decisions", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "c35dd562-e75b-4146-9650-6a7cfe89cba3": {"__data__": {"id_": "c35dd562-e75b-4146-9650-6a7cfe89cba3", "embedding": null, "metadata": {"page_label": "105", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Data Flow Diagrams and Process Charting for Data Integrity and Access Control: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. How does the ISPE Records and Data Integrity Guide suggest using data flow diagrams to enhance understanding and management of access control within computer systems?\n \n2. What specific advantages does the guide list for using data flow diagrams over textual descriptions in the context of understanding relationships between steps in data processes and systems?\n\n3. 
According to the guide, under what circumstances is the need for detailed documentation, such as process flowcharts, increased, and how does this relate to the complexity and size of the system as well as the connections to other systems like inventory, planning, financials, control, and historian systems?", "prev_section_summary": "The section discusses the importance of incorporating data integrity considerations into process flowcharts, particularly in the context of creating and managing training plans. It recommends using data flow diagrams to visually represent the movement and impact of data elements throughout a business process, especially in systems of moderate complexity. The key benefits of creating data flow diagrams in parallel with business process flowcharts are highlighted, as this approach facilitates the understanding of both process and data views for support personnel. The section also outlines the key aspects that data flow diagrams can help identify, such as data impacted by activities, data elements required by regulations, and data necessary for correct decisions.", "excerpt_keywords": "ISPE, Records, Data Integrity, Data Flow Diagrams, Access Control"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## records and data integrity\n\nfigure 14.4: example data flow diagram for granting access to a computer system\n\n[data flow diagram: an access change is needed, so an access request (requestor, username, system username, access requested role, date needed, date requested) is checked at the decision \"request complete and correct?\", reviewed and approved by the system owner with an e-signature, actioned by the system admin (admin name), and confirmed by e-mail notification]\n\nadvantages of data flow diagrams:\n\n- better than text for understanding relationships between steps (data, process, etc.)\n- show decision points\n- illustrate inputs and outputs for each step\n- illustrate links between different processes or systems\n\nhow much is needed?\n\nas the system size, complexity, and support level increase, so does the need to document in greater detail.\n\nin practice, personnel performing a routine process daily or weekly are usually able to identify inherent risks to data integrity and decision points when given a business process flowchart; consequently, a process flowchart can be sufficient.\n\nin contrast, an electronic batch system with connections to inventory, planning, financials, control, and historian systems, for example, can require several process flowcharts: one for each listed connection, two levels of flowcharts for each process, and possibly more levels for some areas.\n\nprocess charting should be sufficient to assist personnel in:\n\n- accomplishing process definition and understanding\n- identifying data integrity risks\n- identifying critical decision points\n- risk identification", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "d2548140-6d4b-4fd6-8849-91d4e70cb760": {"__data__": {"id_": "d2548140-6d4b-4fd6-8849-91d4e70cb760", "embedding": null, "metadata": {"page_label": "106", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Blank Canvas: A Collection of Absence and Presence\"", "questions_this_excerpt_can_answer": "Given the provided context, here are three questions that this specific context can provide specific answers to, which are unlikely to be found elsewhere:\n\n1. 
**What is the file size of the \"ISPE Records and Data Integrity Guide\" as stored in the PharmaWise Engineer project within the PharmaWise CSV & Data Integrity directory?**\n - This question is specific to the document's storage details within a particular project and directory, information that would be unique to this context.\n\n2. **What are the creation and last modified dates of the document titled \"Blank Canvas: A Collection of Absence and Presence\" found within the ISPE Records and Data Integrity Guide PDF?**\n - This question seeks information on the document's version control and update history, which is specific to this file and its management within the project.\n\n3. **Why might the document titled \"Blank Canvas: A Collection of Absence and Presence\" have no content in the provided excerpt, despite being part of the ISPE Records and Data Integrity Guide?**\n - This question probes into the reasoning or significance behind the absence of content in the excerpt, which could relate to the document's purpose, structure, or thematic elements specific to its role within the guide. This question assumes that the surrounding context or the document itself might provide insights into why an excerpt would intentionally be left blank, which could be a unique feature or teaching point within the guide.\n\nThese questions are tailored to elicit information that is uniquely available from the provided context, focusing on the document's physical attributes, versioning details, and content strategy within a specific project environment.", "prev_section_summary": "The section discusses the use of data flow diagrams and process charting for enhancing data integrity and access control within computer systems. It provides an example data flow diagram for granting access to a computer system and lists advantages of using data flow diagrams over textual descriptions. 
The section also highlights the importance of detailed documentation, such as process flowcharts, based on the size, complexity, and connections of the system to other systems like inventory, planning, financials, control, and historian systems. It emphasizes the role of process charting in defining processes, identifying data integrity risks, and critical decision points.", "excerpt_keywords": "ISPE, Records, Data Integrity, Guide, PharmaWise Engineer"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\nno_content_here", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "1a8930bf-4fd9-4a74-af3f-4580e95780da": {"__data__": {"id_": "1a8930bf-4fd9-4a74-af3f-4580e95780da", "embedding": null, "metadata": {"page_label": "107", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Guidelines for Controls and Management of Records, Data, and Electronic Signatures in ISPE GAMP(r) Guide\"", "questions_this_excerpt_can_answer": "1. What are the different levels at which controls for records and data integrity can be applied according to the ISPE GAMP(r) Guide, and what types of controls are discussed?\n \n2. How does the ISPE GAMP(r) Guide suggest managing the risks associated with records, data, and electronic signatures, and what does it recommend regarding the implementation and documentation of selected controls?\n\n3. 
What is the definition of a regulated electronic signature as per the ISPE GAMP(r) Guide, and how does it relate to compliance with GxP regulations?", "prev_section_summary": "The key topics and entities of this section include details about a specific document titled \"Blank Canvas: A Collection of Absence and Presence\" within the ISPE Records and Data Integrity Guide PDF. The section provides information on the file size, creation date, and last modified date of the document, as well as raises questions about the absence of content in the provided excerpt and the potential reasons behind it. The focus is on understanding the unique storage details, version control history, and content strategy of this particular document within a project environment.", "excerpt_keywords": "ISPE, GAMP, records, data integrity, electronic signatures"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: page 105\n\n### records and data integrity appendix d3\n\n### appendix d3 - risk control measures for records, data, and electronic signatures\n\n15.1 introduction\n\nthis appendix discusses principles and controls for records and data and the application of electronic signatures. it describes various control measures that can be used to manage identified risks. the control measures should be aimed at eliminating or reducing the probability of occurrence of the harm, reducing the severity of harm, or increasing the probability of detection. the rigor and extent of controls will depend upon the identified risks to records and data. for further information on controls for paper records and hybrid situations, see appendix o2.\n\n15.2 record and data controls\n\ncontrols may be applied at different levels including, e.g.:\n\n- organizational\n- infrastructure\n- system\n- database\n- record\n- field\n\ncontrols may be behavioral, procedural, or technical in nature. 
this appendix describes procedural and technical controls that can reduce risks to an acceptable level. for further information on behavioral controls, see section 3.4 and appendix m3. a combination of procedural and technical controls may be necessary to adequately manage identified risks. the selected controls should be implemented, verified, and documented. controls may be implemented at the system level (e.g., audit trail). the implementation of procedural controls should be considered at a corporate, site, or department level, as appropriate, to minimize unnecessary duplication of procedures.\n\n15.3 electronic signature controls\n\na regulated signature is a signature required by a gxp regulation. regulated signatures include signatures that document that specific events/actions occurred in accordance with a gxp regulation (e.g., approval, review, or verification of a regulated record). a regulated electronic signature is a regulated signature applied electronically, and intended to be the equivalent of a handwritten signature required by a gxp regulation.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "de63c400-c1db-4618-b917-0c78f43ea26e": {"__data__": {"id_": "de63c400-c1db-4618-b917-0c78f43ea26e", "embedding": null, "metadata": {"page_label": "108", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Electronic Signature Requirements and Implementation Options in the US Regulatory Framework: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. 
How does the ISPE GAMP\u00ae Guide Appendix D3 define the distinction between electronic signatures and identification events within the US regulatory framework, specifically in relation to 21 CFR Part 11 requirements?\n\n2. What specific criteria does the ISPE GAMP\u00ae Guide Appendix D3 outline for the implementation of electronic signatures to ensure they meet the requirements of 21 CFR Part 11, including the necessary controls and information that must be clear for each signature?\n\n3. What implementation options does the ISPE GAMP\u00ae Guide Appendix D3 suggest for ensuring compliant electronic signatures in regulated environments, and how does it describe the appropriate level of control based on the impact and vulnerability of the electronic signature system?", "prev_section_summary": "This section discusses the principles and controls for records and data integrity, as well as the application of electronic signatures according to the ISPE GAMP(r) Guide. It covers different levels at which controls can be applied, including organizational, infrastructure, system, database, record, and field levels. The section also addresses the various control measures that can be used to manage identified risks, including behavioral, procedural, and technical controls. 
It emphasizes the importance of implementing, verifying, and documenting selected controls, particularly in the context of electronic signatures required by GxP regulations.", "excerpt_keywords": "ISPE, GAMP Guide, electronic signatures, 21 CFR Part 11, implementation options"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: appendix d3\n\nwithin the us regulatory framework, the electronic signature requirements of 21 cfr part 11 [2] apply to electronic signatures that are intended to be the equivalent of handwritten signatures, initials, and other general signings required by predicate rules. signatures are not automatically part 11 signatures because they are executed in a regulated system. see fda guidance for industry: part 11, electronic records; electronic signatures - scope and application (section 2: definition of part 11 records) [16].\n\nelectronic signatures should be distinguished from identification events:\n\n- an electronic signature can be regarded as a specific event in the life cycle of a record. a record can be verified, reviewed, or approved, and the status of the record changed (e.g., from draft to final, or unapproved to approved) by the application of an electronic signature.\n- identification events (that may also be required by regulations) are those events where the requirement is only for the identification of an individual performing a particular activity. for example, this may be achieved by the logging of an event by a validated computerized system.\n\nelectronic signatures may be implemented by a unique user id and password combination. 
other uses of user ids and passwords, such as logging on to a system, acknowledgement of alarms, or identification of individuals, are not electronic signatures.\n\nregulated companies should define when electronic signatures are required in regard to their own processes and circumstances, along with their interpretation of gxp regulations. where signatures are applied electronically, appropriate electronic signature controls should be applied.\n\nthe following should be clear for each electronic signature:\n\n- the identity of the signer\n- the date and time when the signature was executed\n- the meaning of the signature (such as verification, review, or approval)\n\nthis information should be clear to any reader or user of the record, e.g., included as part of a human readable form of the signed record, and should be permanently linked to the record, such that it cannot be removed, altered, or copied by ordinary means.\n\nthe following implementation options may be considered when deciding upon a suitable approach to ensuring compliant electronic signatures. the appropriate level of control will depend upon the level of impact and vulnerability:\n\n- method for ensuring uniqueness of electronic signature components, including prohibition of reallocation of user ids\n- prevention of deletion of electronic signature related information after the electronic signature is applied\n- biometrics - \"a method of verifying an individual's identity based on measurement of the individual's physical feature(s) or repeatable action(s) where those features and/or actions are both unique to that individual and measurable.\" (21 cfr part 11 [2])\n- digital signature - \"an electronic signature based upon cryptographic methods of originator authentication, computed by using a set of rules and a set of parameters such that the identity of the signer and the integrity of the data can be verified\". 
(21 cfr part 11 [2])\n- technical or procedural approaches to ensure the integrity of the link between electronic signature and record (especially if handwritten signatures are applied to electronic records)", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "86f1119d-a77e-4af0-b681-7897a7cbb18a": {"__data__": {"id_": "86f1119d-a77e-4af0-b681-7897a7cbb18a", "embedding": null, "metadata": {"page_label": "109", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Record and Data Controls: Security, Access Management, Backup, and Restore Implementation Guide", "questions_this_excerpt_can_answer": "1. What specific measures are recommended for ensuring the security and access management of electronic records in pharmaceutical environments, as outlined in the ISPE Records and Data Integrity Guide?\n\n2. How does the ISPE Records and Data Integrity Guide suggest handling the delegation of electronic signature responsibilities during periods such as holidays or absences to maintain data integrity?\n\n3. What are the detailed considerations and options for implementing backup and restore controls for pharmaceutical data and records, according to the guidelines provided in the ISPE Records and Data Integrity Guide?", "prev_section_summary": "The section discusses the electronic signature requirements within the US regulatory framework, specifically in relation to 21 CFR Part 11. 
It distinguishes electronic signatures from identification events and outlines criteria for their implementation to meet regulatory requirements. The section also covers the necessary controls and information that must be clear for each signature, including the identity of the signer, date and time of the signature, and the meaning of the signature. Implementation options for compliant electronic signatures are discussed, such as uniqueness of components, prevention of deletion of signature information, biometrics, digital signatures, and ensuring the integrity of the link between the signature and the record.", "excerpt_keywords": "Records, Data Integrity, Security, Access Management, Backup and Restore"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## records and data integrity\n\nmethod of display or print of signed records\n\nprocedure for delegation of electronic signature responsibilities (e.g., covering holidays or periods of absence)\n\noptions for entry of all or some components of multiple component electronic signatures\n\n## implementation of record and data controls\n\ncontrols may be implemented in different ways and with differing degrees of rigor. 
table 15.1 shows how various types of controls can be implemented.\n\n|control|implementation considerations and options|\n|---|---|\n|security and access management|- physical security\n- formal access authorization\n- confirming identity of new user before granting access\n- unique user identification\n- providing defined profiles for individual users or groups\n- clear separation of server administration, application administration, and user roles and responsibilities\n- limiting write, update, or delete access (e.g., to key users)\n- enforced password changing\n- enforced minimum password length and format\n- idle time logout (inactivity logout or timeout)\n- management of lost or compromised passwords\n- group access (sharing of access accounts)\n- proactive monitoring for attempted breaches\n- automated measures on attempted unauthorized access (e.g., lock account, notify management)\n- limiting and controlling use of superuser accounts\n- testing and renewal of identity devices or tokens\n- access revocation:\n  - access rights change and removal process\n  - hr monitoring of staff changes\n  - periodic access rights review\n|\n|backup and restore|- frequency of backups\n- auto or manual processes\n- backup verification\n- backup media\n- storage conditions\n- storage location(s) including remote storage locations\n- media management (e.g., labeling, storage, rotation, refresh)\n- high availability system architecture\n- mirroring and redundancy\n- periodic restore verification\n|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "fad4750c-084b-4e1d-84ee-e06c9295a3fa": {"__data__": {"id_": "fad4750c-084b-4e1d-84ee-e06c9295a3fa", "embedding": null, "metadata": {"page_label": "110", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo 
Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Data Integrity and Records Management Controls in ISPE GAMP(r) Guide Appendix D3: A Comprehensive Guide\"", "questions_this_excerpt_can_answer": "1. What specific considerations and options does the ISPE GAMP(r) Guide Appendix D3 recommend for ensuring disaster recovery and business continuity in the context of records and data integrity management?\n\n2. How does the ISPE GAMP(r) Guide Appendix D3 detail the implementation of audit trails in terms of events to be tracked, the nature of the audit trail (automatic, manual, or a combination), and the security measures recommended for maintaining the integrity and confidentiality of the audit trail data?\n\n3. What guidelines does the ISPE GAMP(r) Guide Appendix D3 provide regarding the copying and retention controls for electronic records, including the format of copies, the process for producing copies, retention periods, and procedures for ensuring the integrity and accessibility of retained data over time?", "prev_section_summary": "The section discusses the importance of records and data integrity in pharmaceutical environments, focusing on security, access management, backup, and restore controls. Key topics include methods for displaying or printing signed records, delegation of electronic signature responsibilities, and options for implementing controls such as security measures and access management. The section also outlines considerations for backup and restore processes, including frequency, verification, storage conditions, and system architecture. 
Entities mentioned include physical security, access authorization, user identification, password management, backup media, storage locations, and system redundancy.", "excerpt_keywords": "ISPE, GAMP, data integrity, records management, audit trails"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: appendix d3 records and data integrity\n\n|control|implementation considerations and options|\n|---|---|\n|disaster recovery and business continuity|- service level agreements\n- formal contracts for restoration of service\n- high availability system architecture\n- assessment of possible failure modes\n- defined allowable time of outage\n- recovery mechanisms (e.g., hot standby, procedural)\n- documented testing of the disaster recovery and business continuity plan\n- defined recovery point objective and recovery time objective for different systems\n- documented procedures for business continuity and number of personnel trained in these procedures\n|\n|audit trail|- which events are audit trailed (record creation, modification, deletion)\n- reason configurable/predefined or free text\n- purpose, e.g.: - as a part of normal business data verification\n- for auditing of authorized or unauthorized changes to data\n- type (automatic, manual, combination)\n- date and time stamped\n- identification of time zone\n- amount of information retained (who/what/when)\n- access control and security of the audit trail\n- ability to change the audit trail\n- retention of the audit trail\n- backup and restore of the audit trail\n- procedures for managing the audit trail\n- retention of previous versions of data\n|\n|copying and retention controls|- format of copy (e.g., common portable electronic, paper)\n- reference to original on copy\n- relationship with original (e.g., exact copy, summary)\n- search, sort, and trend capabilities\n- process for producing copies
(time required, access levels)\n- checksums\n- retention periods\n- definition of what is being retained\n- retention of associated data (e.g., audit trails, configuration information)\n- indexing and searching to aid retrieval\n- capacity limits\n- automatic or requiring human intervention\n- ability to reprocess data\n- formal disposal procedure\n- periodically testing ability to retrieve records throughout retention period\n- media maintenance procedures throughout retention period: - ability to read physical media\n- dependence on original version of software application\n- dependence on original version of operating system\n- dependence on original configuration of hardware\n|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "47523fe7-3a09-4a73-a211-4ec70082c17b": {"__data__": {"id_": "47523fe7-3a09-4a73-a211-4ec70082c17b", "embedding": null, "metadata": {"page_label": "111", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Ensuring Data Integrity in Regulated Systems: Implementation Considerations", "questions_this_excerpt_can_answer": "1. What specific software controls are recommended in the ISPE Records and Data Integrity Guide for ensuring data integrity in regulated systems, and how do these controls align with business rules and GxP requirements?\n\n2. 
How does the ISPE Records and Data Integrity Guide suggest regulated companies should manage the lack of availability of specific technical controls in automated systems to maintain the integrity of electronic records and signatures?\n\n3. What roles and responsibilities does the ISPE Records and Data Integrity Guide assign to suppliers of systems in terms of documentation and audit readiness for electronic records and signatures management in regulated environments?", "prev_section_summary": "The section discusses the implementation considerations and options recommended by the ISPE GAMP(r) Guide Appendix D3 for ensuring disaster recovery and business continuity, audit trail implementation, and copying and retention controls for electronic records. Key topics include service level agreements, high availability system architecture, audit trail events, format of copies, retention periods, and procedures for managing the audit trail and retained data. Key entities mentioned include disaster recovery mechanisms, audit trail information, retention periods, and media maintenance procedures.", "excerpt_keywords": "ISPE, Records, Data Integrity, GxP requirements, Electronic signatures"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## records and data integrity\n\n### appendix d3\n\n|control|implementation considerations and options|\n|---|---|\n|software controls|- functional controls based on business rules and gxp requirements\n- user identity checks\n- interfaces: - checksums and other verification of data transfer\n- standard network protocols for data transfer\n- automatic functionality to reduce human error, e.g.: - use of barcodes or other electronic data reading functionality\n- sequence enforcement\n- creation of predefined text and lists\n- measurement redundancy in critical applications\n- data entry checking\n- error handling\n- alarms\n- notification of 
software failure\n- prompting for confirmation of action\n- monitoring tools (e.g., event logs)\n|\n|hardware controls|- mirrored or raid drives\n- ups\n- contingency in sizing of hardware\n- network monitoring (could be also software control)\n- virtualization management\n|\n|policies and procedures|- formality of policies and procedures\n- extent of qa involvement\n- formality and roles involved in authorization\n- formality and roles involved in review\n- formality and roles involved in approval\n- internal audit processes to confirm adherence to procedures\n|\n|training and experience|- importance of data integrity principles\n- training and experience of users\n- training and experience of developers of systems (both regulated companies and suppliers)\n- significance of electronic signatures in terms of individual responsibility\n- consequence of falsification\n- use of electronic signatures\n|\n\nmany of the controls identified in table 15.1 are technical in nature and will form part of the functionality of a supplied system. suppliers should be aware that these controls may be typical requirements for systems supplied to regulated companies. suppliers should be prepared for assessments, including auditing, to ascertain that technical controls have been implemented appropriately, as regulated companies have ultimate responsibility for the system in use. suppliers should provide documentation that defines which electronic records and signatures a system can maintain. the controls available to help manage electronic records and signatures should also be described; regulated companies can use this information during the risk management process. where it is not possible to implement specific technical controls, e.g., lack of availability in the automated system, the use of alternative technical or procedural controls should be considered. 
the use of several procedural controls may produce sufficient corroborative information to support evidence for the control of electronic records.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "11fee860-d56f-42cd-b25b-ddb1cdeff9c9": {"__data__": {"id_": "11fee860-d56f-42cd-b25b-ddb1cdeff9c9", "embedding": null, "metadata": {"page_label": "112", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Implementing Robust Electronic Record Management Systems for Compliance in Regulated Industries\"", "questions_this_excerpt_can_answer": "1. How does the ISPE GAMP\u00ae Guide suggest regulated companies manage the risk and impact associated with electronic records to ensure compliance in regulated industries?\n \n2. What specific factors should be considered when determining the rigor of controls needed for electronic record management systems according to the ISPE Records and Data Integrity Guide?\n\n3. What are the two approaches recommended by the ISPE Records and Data Integrity Guide for applying controls to electronic records, and how do these approaches differ based on the identified risks associated with different types of records?", "prev_section_summary": "The section discusses the importance of records and data integrity in regulated systems, outlining specific software controls, hardware controls, policies and procedures, and training considerations to ensure data integrity.
It emphasizes the need for technical controls aligned with business rules and GxP requirements, as well as the roles and responsibilities of suppliers in providing documentation and audit readiness for electronic records and signatures management. The section also highlights the use of alternative technical or procedural controls when specific technical controls are not available in automated systems to maintain data integrity.", "excerpt_keywords": "ISPE, GAMP, electronic records, data integrity, compliance"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: appendix d3 records and data integrity\n\nsuppliers may provide administrative features and utilities which can make the implementation of procedural controls more efficient, consistent, and secure. for example, the inclusion of a system workflow to route lists of authorized users to process owners on a periodic basis for review.\n\n### 15.5 rigor of controls\n\nthe rigor with which the controls are applied should consider both the impact of the electronic record and the risks identified. 
as the impact and risk increase, more rigorous controls are required, as shown in figure 15.1.\n\n|figure 15.1: rigor of required controls||\n|---|---|\n|increasing severity of impact: increased effect on patient safety, product safety, and gxp compliance|increased rigor of control required. consider: more controls, more frequent controls, automatic controls, increased internal audits|\n|increasing probability and decreasing detectability: increased potential for loss of record, corruption of record, wrong record, and lack of detection||\n\nfor electronic records, regulated companies should consider the need for:\n\n- authenticity\n- integrity\n- accuracy\n- reliability\n- confidentiality (where appropriate)\n\na combination of technical and procedural controls may be needed to achieve an adequate level of protection. for systems containing multiple types of records, two approaches are:\n\n1. apply controls to all records appropriate to the highest identified risk\n2. apply controls to individual record types appropriate to the identified risk for each type", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "97cddee0-5799-4d08-935b-386befe953ee": {"__data__": {"id_": "97cddee0-5799-4d08-935b-386befe953ee", "embedding": null, "metadata": {"page_label": "113", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Managing Data Integrity Concerns in Local Hard Disk Systems: Strategies and Best Practices", "questions_this_excerpt_can_answer": "1.
What specific strategies are recommended for managing data integrity concerns on systems where data resides on a local hard disk, particularly in relation to older laboratory instrument control systems?\n \n2. How does the document address the issue of attributability and audit trails for data stored on local hard disks, and what are the suggested measures to ensure data integrity in such environments?\n\n3. What are the recommended practices for backup and archiving of data that is stored on local hard disks according to the ISPE Records and Data Integrity Guide, especially in contexts where the application does not include an inherent archive function?", "prev_section_summary": "The section discusses the importance of implementing robust electronic record management systems for compliance in regulated industries, as outlined in the ISPE Records and Data Integrity Guide. It emphasizes the need for considering the impact and risks associated with electronic records when determining the rigor of controls needed. The section also highlights the factors to be considered, such as patient safety, product safety, and GXP compliance, in determining the level of controls required. Additionally, it mentions the need for authenticity, integrity, accuracy, reliability, and confidentiality in electronic records, and suggests two approaches for applying controls based on identified risks.", "excerpt_keywords": "Data Integrity, System Architecture, Local Hard Disk, Laboratory Instrument Control Systems, Backup and Archiving"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## appendix d4 - data integrity concerns related to system architecture\n\nthis appendix addresses different architectures and approaches to managing data integrity issues that relate to each. the architecture of applications will impact the controls that are appropriate to ensure data integrity. 
architectures that should be considered range from the c:\\ drive on a pc that collects or processes gxp data to saas applications for which the data owner may not know where a particular record resides. architecture choices can have direct and obvious data integrity impact; others will have more subtle and indirect impact.\n\n### data resides on a local hard disk\n\nthis may be the simplest architecture, but these systems can have the greatest vulnerability because of the lack of built-in controls. this may be particularly problematic for laboratory instrument control systems which have not been designed with data integrity in mind. this is likely the case with older instruments. where applications do not provide adequate protection, operating system (os) level controls should be implemented where possible. in cases where an application does have sufficient protection, os level controls should be established to ensure that the application level controls cannot be avoided by accessing the data directly through the os. the following should be considered:\n\n- attributability: login should be required to ensure that a record created on the system is attributable to the person who created it. if this is not possible, consideration should be given to upgrading or replacing the system. a logbook may be kept, but this may be ineffective.\n- audit trails: elements that may make audit trails less trustworthy include:\n- improper control of the system clock\n- uncontrolled data access at the operating system level\n- lack of attributability\n- segregation of duties: os level access to data should be limited to it.
laboratory analysts should not have administrator rights on instrument controllers or data systems.\n- protection from hazards: in cases where storage media are exposed to unusual risk (e.g., potential vulnerability of laboratory or manufacturing systems to spills or other hazards) measures should be considered that could minimize such exposure, e.g., placing the systems in a safer area or in an enclosure.\n- backup: locally stored records should be protected. data should be accumulated to a managed network drive instead of a local hard disk. an alternative is an automated backup process where local files are automatically copied periodically to a managed network drive. a properly executed manual backup may also be used. note: backup media should be suitably protected and should be stored in a remote location.\n- archive: archiving may be a simple process if the pc application includes an archive function. if the pc application does not include an archive function, it may be difficult to manually move all data and associated metadata to archive media. 
archive media should have a level of protection similar to back-up media.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "3ae528d4-5ec4-41df-a9fd-0ad5bca070d1": {"__data__": {"id_": "3ae528d4-5ec4-41df-a9fd-0ad5bca070d1", "embedding": null, "metadata": {"page_label": "114", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Data Integrity and Disaster Recovery in ISPE GAMP(r) Guide: Best Practices and Guidelines", "questions_this_excerpt_can_answer": "1. What specific architectural approach does the ISPE GAMP(r) Guide recommend for ensuring the integrity of data in internally managed central databases, and how does it address attributability and segregation of duties?\n \n2. How does the ISPE GAMP(r) Guide suggest handling backup processes for internally managed central databases to ensure data integrity, and what unique considerations does it recommend for businesses with seasonal or infrequent application use?\n\n3. What are the guidelines provided by the ISPE GAMP(r) Guide for managing archives in relation to the data life cycle, and why does it advise against retaining backups as a substitute for a true archive system?", "prev_section_summary": "The section discusses data integrity concerns related to system architecture, specifically focusing on systems where data resides on a local hard disk. Key topics include attributability, audit trails, segregation of duties, protection from hazards, backup, and archiving. 
The section emphasizes the importance of implementing controls at the operating system level when applications do not provide sufficient protection, particularly in the context of older laboratory instrument control systems. Recommendations are provided for ensuring data integrity in such environments, such as requiring login for attributability, limiting OS level access, and implementing backup and archiving processes to protect locally stored records.", "excerpt_keywords": "ISPE, GAMP, data integrity, disaster recovery, central database"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: appendix d4\n\n### disaster recovery\n\nthere should be specific plans to deal with system loss. stating that a new pc will be obtained may not be acceptable, as a pc with the correct configuration may not be available.\n\n### records and data integrity\n\n- data accessibility: local user ability to access individual files on the system, restoring deleted files, renaming files, etc.\n\n### internally managed central database\n\nthese systems are either server-based applications or pc-based applications where all data management occurs outside the pc. the key to this is that such architecture should be among the easiest to properly manage to ensure integrity of data. data integrity protection should address:\n\n|attributability|based on login. in this architecture, it can be harder to justify a paper-based control such as a logbook.|\n|---|---|\n|segregation of duties|administrative rights should be limited to it professionals. it may be preferable to assign some limited administrative functions, such as approval of user rights, to the business. there should be no potential conflict of interest.|\n|backup|in general, this will be handled through enterprise processes owned by it. there should be a standard periodicity to take incremental and full backups. 
backup media should be stored securely (usually offsite). media may be recycled according to a standard practice, e.g., only the four most recent copies are retained, with the fifth iteration being overwritten onto the media used for the first.|\n\nhowever, the business process owner should ensure that such an enterprise process is compatible with the actual business process. for example, if an application is only used in january to compile annual summaries, the process described would not work. if the data became corrupted between february and august, the next time the database is opened in january the corruption would be found, but the corruption would have been propagated to all existing backup copies.\n\n- archive\n\narchives should be managed in alignment with the data life cycle. this includes the destruction of all archive copies, including backups, when the records reach the end of their retention period.\n\nnote: the retention of backups in lieu of a true archive is not recommended. it can make record destruction problematic, as it is very difficult to selectively remove expired records. restoring a backup to access archived records could have significant business impact.\n\n- this should include testing, since there are likely to be dependencies on other enterprise-owned assets. disaster recovery planning should follow a well-defined risk-based process, so that systems with major patient safety or business impact are appropriately scheduled in case of a wide-ranging disaster.\n\n### internally managed distributed data\n\ndistributed systems require the same protection as centralized systems. added complications could occur based on two architecture subtypes.\n\n### locally unique data accessible globally\n\nlocal databases may be used to achieve desired performance of the system at multiple sites. this generally does not involve managing local record copies at sites other than the one at which the records were generated. 
a small subset may have local copies that were saved in accordance with local business practice. for example, manufacturing records for products made offshore may be copied locally to support a regulatory compliance expectation.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "08e8e0ac-89dc-4fe3-97be-080267d86f84": {"__data__": {"id_": "08e8e0ac-89dc-4fe3-97be-080267d86f84", "embedding": null, "metadata": {"page_label": "115", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Ensuring Data Integrity and Compliance in Global Information Systems: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. What are the specific considerations and steps regulated companies should take when managing data replicated globally, especially concerning the destruction of records across different jurisdictions?\n \n2. How should regulated companies approach outsourcing data management services, particularly cloud-based solutions, to ensure compliance with data integrity and regulatory requirements in a GxP environment?\n\n3. What are the key differences in expectations regulated companies should have when evaluating cloud solution providers for GxP processes, especially those not primarily serving the pharmaceutical industry, in terms of documentation and process formalities?", "prev_section_summary": "The section discusses disaster recovery and data integrity in the context of the ISPE GAMP(r) Guide. 
Key topics include specific plans for system loss, data accessibility, internally managed central databases, backup processes, archive management, and disaster recovery planning. Entities mentioned include attributability, segregation of duties, backup media storage, archive management, and disaster recovery testing. The section emphasizes the importance of aligning archive management with the data life cycle and highlights the risks of retaining backups as a substitute for a true archive system.", "excerpt_keywords": "Data Integrity, Compliance, Global Information Systems, Outsourcing, Cloud Solutions"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## records and data integrity\n\nappendix d4\n\nthe local databases should be managed similarly to the centralized system described above. complication may occur regarding treatment of data that is required to be retained in other jurisdictions. for global information systems, regulated company processes should account for the use of information at other sites when making decisions related to archive management and data destruction.\n\n### 16.3.2 data replicated globally\n\nthe issues described for centralized systems apply and should be appropriately addressed for data replicated globally. for replicated data when records are scheduled for destruction, all copies of the record should be destroyed. this should account for all locally archived copies in addition to copies in the active database. failure to do so could expose the regulated company to legal discovery liabilities.\n\nthe second complication with centrally stored records is that retention policies need to recognize the potentially differing requirements based on the applicable jurisdictions. 
for example, some blood product records need to be retained for ten years in the united states, whereas the same records need to be retained for thirty years in europe and japan. therefore, knowledge of where the product has been distributed is key in determining the timing of the steps in the data life cycle.\n\nthe same considerations described above apply to any centrally managed archive.\n\n### 16.4 outsourced managed services\n\nwhile this section concentrates on cloud-based solutions, considerations described may also be applicable to any outsourced or externally managed infrastructure and/or applications. as part of an outsourcing process regulated companies need to:\n\n- understand and accept which aspects of control are being delegated to a provider\n- assess and accept the controls implemented by the provider\n- contractually define the level and frequency of reporting\n- agree the need for supplier support during regulatory inspections, depending on the architecture and services provided\n\nif any of the above are not satisfactory, the decision to outsource should be revisited.\n\nfor further information on cloud-based solutions in a gxp environment, see the pharmaceutical engineering magazine articles by david stokes \"compliant cloud computing - managing the risks\" [28] and the ispe gamp (r) cloud computing special interest group (sig) \"cloud computing in a gxp environment: the promise, the reality and the path to clarity\" [29].\n\nin general, the evaluation process, the controls, and the complexity of contractual and service level agreements should increase in line with the amount of control the regulated company is transferring to the cloud provider. from lowest to highest this is iaas - paas - saas.
each type of solution should have the controls discussed for the lower level solution(s) in addition to those discussed at that level.\n\nregulated companies may consider cloud solutions for gxp processes that have not been specifically developed for the gxp world. when evaluating such a supplier, the regulated company should not expect to see the same processes that would be found in a supplier whose primary customer is the pharmaceutical industry. documentation may be less formal; management approval may not be required in as many places, etc. the emphasis should be on evaluating the state of control over the high-risk processes. the regulated company should look for \"gxp-compatible processes.\"", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "3e06f4c7-dfff-4bce-8fab-fa6f44831c15": {"__data__": {"id_": "3e06f4c7-dfff-4bce-8fab-fa6f44831c15", "embedding": null, "metadata": {"page_label": "116", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Ensuring Data Integrity in Cloud-Based Systems: Considerations for Regulated Companies", "questions_this_excerpt_can_answer": "1. What specific considerations should a regulated company take into account when assessing the risks related to using cloud storage (Infrastructure as a Service, IaaS) for data management, especially concerning the location of data storage and the encryption of data?\n\n2. 
How should a regulated company approach the issue of administrative rights within cloud providers, especially in cases where a provider's internal policy allows a large number of staff to have administrative access, and what compensating controls might be considered acceptable?\n\n3. In the context of engaging with a Platform as a Service (PaaS) supplier, what are the key considerations a regulated company should have regarding change control, especially in terms of support for older software versions and the potential data integrity risks associated with forced upgrades and data migrations?", "prev_section_summary": "This section discusses the importance of data integrity and compliance in global information systems, specifically focusing on managing data replicated globally and outsourcing data management services. Key topics include the management of local databases, considerations for data destruction across different jurisdictions, retention policies for centrally stored records, and the outsourcing process for cloud-based solutions. The section emphasizes the need for regulated companies to understand and accept the controls implemented by providers, define reporting requirements, and ensure supplier support during regulatory inspections. 
Additionally, it highlights the differences in expectations when evaluating cloud solution providers for GxP processes and emphasizes the importance of evaluating the state of control over high-risk processes in cloud solutions.", "excerpt_keywords": "Data Integrity, Cloud Storage, Regulated Companies, Change Control, Disaster Recovery"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: appendix d4 records and data integrity\n\nthe regulated company should consider whether there are reasonable and appropriate controls that ensure data integrity.\n\n### 16.4.1 internally managed with cloud storage (infrastructure as a service (iaas))\n\nthe requirements for these systems are the same as for internally managed centralized systems, except that some of the tasks of managing the data will fall to external personnel. the following should be accounted for in assessing the risks related to this architecture:\n\n- data management processes at the cloud provider need to be assessed to make sure that the regulated company is satisfied that the providers controls are adequate. if there are some countries where the regulated company does not want data stored, this needs to be contractually agreed.\n- depending on the level of access and the type and format of the information being processed or stored in the cloud the regulated company may decide that the data should be encrypted.\n- some cloud providers may have internal policies that give administrative rights to dozens or hundreds of staff, believing that they need to have the internal flexibility to assign any employee to work on any contract. while this is probably not necessary, a regulated company is unlikely to be able to convince a cloud supplier to change that model. 
the regulated company should assess whether this can be acceptable, possibly with additional compensating controls.\n- supplier change control processes should be evaluated to ensure that proper and timely notification is given for changes that may impact data integrity.\n- disaster recovery processes should be assessed to ensure that they will restore data access in a time frame acceptable to the regulated company. this should include a mutually agreeable recovery time objective (rto, or how quickly service is restored), and this should be included in the sla. recovery point objective (rpo), or how much data can be lost since the last backup, is the responsibility of the customer (the regulated company), as they are managing the database on the supplier equipment. however, if data backup is contracted to the supplier, this can affect the ability to meet the rpo.\n- before entering into an arrangement with a cloud service there should be an agreed and well-defined process for disengagement. this should address timing, including both advance notice of intent to sever the relationship and the time allowed to do it, supplier and regulated company responsibilities, and cost. disengagement should also include the removal of data from provider owned equipment, and if applicable backup media.\n\n### 16.4.2 internally managed application with cloud based platform\n\nsolutions involving a platform as a service (paas) supplier should include consideration of all of the above plus:\n\n- change control will have wider impact, as it will extend beyond hardware and operating systems and into layered software. when entering into an agreement with a paas supplier, it should be clarified what the supplier's policy is related to support of older software versions. 
for example, if the providers policy is to support only the current and one older version of a database it may drive more upgrades than the regulated company desires, and such upgrades may require data migrations with all the associated data integrity risks.\n- disaster recovery responsibilities will move more toward the supplier. in addition to rto, the supplier will probably be charged with meeting the rpo requirements as well; therefore, rpo should be covered in the sla.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "2cd0847d-0817-4d1f-8621-817ec70791b6": {"__data__": {"id_": "2cd0847d-0817-4d1f-8621-817ec70791b6", "embedding": null, "metadata": {"page_label": "117", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Ensuring Data Integrity and Accountability in SaaS Systems for Regulated Companies and Suppliers: Key Considerations", "questions_this_excerpt_can_answer": "1. What specific considerations should be made when a regulated company is negotiating contracts with SaaS providers to ensure data integrity and accountability, especially regarding the management and access rights of database administrators (DBAs)?\n \n2. How should regulated companies approach the issue of SaaS providers using customer data for testing software changes, including the necessary precautions to protect confidential data?\n\n3. 
In the context of adopting SaaS systems within regulated industries, how can companies reconcile the use of agile development processes by SaaS suppliers with the traditional expectations of Good Practice (GxP) compliance, particularly concerning software development documentation and quality control?", "prev_section_summary": "This section discusses considerations for regulated companies when assessing data integrity in cloud-based systems, specifically focusing on Infrastructure as a Service (IaaS) and Platform as a Service (PaaS) suppliers. Key topics include assessing risks related to cloud storage, encryption of data, administrative rights within cloud providers, change control processes, disaster recovery processes, and disengagement processes. Entities mentioned include regulated companies, cloud providers, data management processes, disaster recovery responsibilities, and supplier change control processes.", "excerpt_keywords": "Data Integrity, SaaS Systems, Regulated Companies, Cloud Providers, Agile Development"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: page 115\n\n### records and data integrity appendix d4\n\n* in addition, staff at the provider may now be directly managing data, e.g., as a database administrator (dba). it should be understood and documented what the dba can do. for example, is the dba allowed to make direct data changes, and if so what controls are in place for that? dba access to confidential data may also make encryption advisable. the impact of a supplier policy of wide granting of administrative rights has greater potential data integrity impact if it applies to dba access, as well as to hardware support.\n\n* suppliers may have multiple data centers and will distribute load in order to balance the demand. this could entail placing data in one country and manipulating the platform from another. 
if either of these is unacceptable to the regulated company, restrictions need to be contractually negotiated.\n\n### 16.4.3 software as a service (saas)\n\nevery issue noted above applies to saas systems, but aside from decisions related to record retention virtually all of the data management activities are carried out by the supplier. it should be recognized that the regulated company remains accountable for the integrity of the data, regardless of the fact that it is being managed by a service provider. this means that the contract and sla should be written to ensure mutually agreeable controls are in place. some of these controls will have direct data integrity impact, and others indirect. specific points that should be addressed include:\n\n- some saas providers execute non-optional changes periodically. for the most part, this is not likely to have a negative impact, but for gxp applications there needs to be sufficient prior notification to allow testing, and defined processes at the regulated company for dealing with the impact of both successful and unsuccessful testing.\n- it may be that the supplier wants to use customer data to test software changes. such testing should only be allowed with the express permission of the regulated company. some precautions like deidentification or masking of confidential data may be advisable.\n- a provider may have internal processes for incident management that delay reporting of serious issues to customers pending preliminary investigation. depending on the application, this might be unacceptable to the regulated company. this needs to be outlined in the sla.\n- similarly, a saas provider may be reluctant to activate disaster recovery processes because of the marketing fallout of a declared disaster. as a result, they may allow themselves a few hours to troubleshoot before declaring a disaster, and this may impact data collection and processing during this early stage of a disaster. 
it is incumbent upon the customer (regulated company) to examine and understand the provider's disaster recovery procedures.\n- as with paas, the saas supplier may want to move or archive data at other locations, and they may not even know where at the time of engagement. the sla should address whether this can be allowed, or at least timely notification of such actions.\n\nwhen evaluating a saas supplier, regulated company auditors are unlikely to find software development practices that are aligned with traditional gxp expectations. for example, many saas suppliers use some form of agile development process, and some interpretations of the agile manifesto avoid documentation in favor of rapid results. evaluation should concentrate on the state of control of the software development process, focusing on compensating controls and any documentation that is eventually produced. an agile sdlc that allows for quality and documentation to be defined and used, as required, can be used to support validation. 
companies would not employ agile methodologies if they could not produce reliable software when used properly - \"gxp compatible\" processes should be the goal.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "f7a36bc3-a6c1-4aa3-b687-bee182117fc8": {"__data__": {"id_": "f7a36bc3-a6c1-4aa3-b687-bee182117fc8", "embedding": null, "metadata": {"page_label": "118", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Ensuring Data Integrity in the Life Sciences Industry: A Guide to Navigating Records with SaaS Suppliers\"", "questions_this_excerpt_can_answer": "1. What specific challenges might a SaaS supplier face when attempting to enter the life sciences industry, according to the ISPE Records and Data Integrity Guide?\n \n2. How does the ISPE Records and Data Integrity Guide suggest a regulated company should assist a SaaS supplier in developing processes that meet the stringent requirements of the life sciences industry?\n\n3. According to the document, what strategic approach does the ISPE Records and Data Integrity Guide recommend a prospective customer take when considering the engagement of a SaaS supplier with no experience in serving regulated company clients in the life sciences sector?", "prev_section_summary": "The section discusses key considerations for ensuring data integrity and accountability in SaaS systems for regulated companies and suppliers. 
It covers topics such as negotiating contracts with SaaS providers, managing database administrators, data encryption, data center locations, software changes testing, incident management, disaster recovery procedures, and agile development processes. The section emphasizes the importance of mutually agreeable controls in contracts and SLAs to maintain data integrity and compliance with Good Practice (GxP) standards. It also highlights the need for regulated companies to understand and oversee the data management activities carried out by SaaS providers to ensure data integrity and accountability.", "excerpt_keywords": "ISPE, Records, Data Integrity, SaaS, Life Sciences Industry"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n# page 116\n\nispe gamp(r) guide: appendix d4\n\nrecords and data integrity\nsome concerns may require a degree of compromise on the part of a saas supplier and there may need to be some investment in documentation and process changes. this may entail resistance from the supplier. however, the life sciences industry is among the most heavily regulated and any technology company desiring to enter this arena needs to be aware of this. the regulated company should be prepared to assist the supplier in developing acceptable processes.\na prospective customer needs to consider the risks of engaging a saas supplier with no experience of regulated company clients. while flexibility in the form of accepting gxp compatible processes is important, the willingness to walk away if the circumstances are not right should always remain on the table. 
a risk management approach should always be applied to the selection process.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "bb326f96-34de-41f9-8ff3-a2d24f33ac30": {"__data__": {"id_": "bb326f96-34de-41f9-8ff3-a2d24f33ac30", "embedding": null, "metadata": {"page_label": "119", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Ensuring Data Integrity in End User Applications: Risks, Controls, and Regulatory Compliance", "questions_this_excerpt_can_answer": "1. What are the common types of end-user applications mentioned in the ISPE Records and Data Integrity Guide, and why are they considered to have an increased risk to data integrity in GxP processes?\n\n2. According to the ISPE Records and Data Integrity Guide, what are the specific weaknesses in data attribution associated with spreadsheets as end-user applications, and how do these weaknesses impact data integrity?\n\n3. What guidance does the ISPE Records and Data Integrity Guide provide regarding the implementation of controls for spreadsheets to ensure data integrity, and what external resources does it reference for further guidance on spreadsheet management and validation in regulated environments?", "prev_section_summary": "The section discusses the challenges that a SaaS supplier may face when entering the life sciences industry, including the need for compromise, investment in documentation and process changes, and resistance from the supplier. 
It also highlights the importance of regulated companies assisting SaaS suppliers in developing processes that meet industry requirements. Additionally, the section emphasizes the need for prospective customers to consider the risks of engaging a SaaS supplier with no experience in serving regulated company clients, and the importance of applying a risk management approach to the selection process. Key entities mentioned include SaaS suppliers, regulated companies, and prospective customers.", "excerpt_keywords": "Data Integrity, End User Applications, Spreadsheets, Regulatory Compliance, GxP Processes"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## records and data integrity\n\n### appendix d5\n\n### appendix d5 - data integrity for end-user applications\n\n17.1 introduction\n\nend user applications are small applications that are typically created outside of traditional software development environments. they may be developed by the people who will use them and can range in complexity from users simply clicking a series of buttons to being able to modify and execute code directly. end user applications may be repeatedly created based on stored templates. examples include:\n\n- spreadsheets (the most frequent type)\n- small databases (regularly pc-based)\n- statistical programs (e.g., developed on a sas(r) platform)\n- computer programs (e.g., developed in visual basic)\n\nthe decision to use end user applications for gxp processes should be risk-based. the use of end user applications is considered as having an increased risk to data integrity. for example, there have been numerous us fda warning letters regarding the use of spreadsheets to manage gxp records.\n\nend user applications may have weaknesses in the attribution of actions. 
for example, by default a spreadsheet records the identity of the creator and of the last person to modify it, but users who have modified it in the interim are not tracked. similar issues may also apply to the use of statistical programs or other similar applications.\n\nend user applications usually need extra controls in order to satisfy regulatory expectations. appropriate controls should be implemented to mitigate risks to an acceptable level. end user applications and the data they generate should be stored in a secure manner. this can include the use of operating system controls, e.g., read/create/modify/delete access restrictions and auditing os level audit trails (if available).\n\n17.2 data integrity for spreadsheets\n\nspreadsheets are useful tools that are attractive for a variety of uses related to regulated activities. it is the flexibility that makes a spreadsheet a high-risk form of electronic record from a data integrity standpoint, particularly if the spreadsheet is not carefully controlled. limits should be set related to the manner in which spreadsheets are used and managed. ispe gamp(r) 5 provides a discussion of the various types of uses of spreadsheets and validation implications. 
there is also guidance available from regulators, e.g., us fda field science and laboratories: laboratory manual volume 3 - laboratory operations, applications and programs: section 4.5 - development and validation of spreadsheets for calculation of data; or dfs/ora laboratory information bulletin no.4317, spreadsheet design and validation for the multi-user application for the chemistry laboratory part 1 (2004).\n\n4 us fda warning letters can be viewed on the agency website: www.fda.gov.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "984c7055-96fa-49a0-b4f6-40c4ca0e711b": {"__data__": {"id_": "984c7055-96fa-49a0-b4f6-40c4ca0e711b", "embedding": null, "metadata": {"page_label": "120", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Managing Integrity of Spreadsheets in Regulated Environments: Best Practices and Guidelines", "questions_this_excerpt_can_answer": "1. What specific measures should be taken to ensure the integrity of spreadsheet templates used in regulated environments, according to the ISPE GAMP\u00ae Guide's appendix on records and data integrity?\n\n2. How does the ISPE Records and Data Integrity Guide recommend managing spreadsheets that are used for single-use analyses, such as investigating out-of-specification results or evaluating manufacturing trends, to ensure data integrity?\n\n3. 
What are the recommended practices for controlling static spreadsheets in environments where an Electronic Document Management System (EDMS) is not available, as outlined in the ISPE Records and Data Integrity Guide?", "prev_section_summary": "This section discusses the importance of ensuring data integrity in end-user applications, such as spreadsheets, small databases, statistical programs, and computer programs. It highlights the increased risk to data integrity associated with the use of end-user applications in GxP processes and the specific weaknesses in data attribution that may occur. The section emphasizes the need for implementing controls to mitigate risks and ensure data integrity, particularly in the case of spreadsheets. It references guidance from ISPE GAMP 5 and regulators like the US FDA for further information on managing and validating spreadsheets in regulated environments.", "excerpt_keywords": "ISPE, GAMP, data integrity, spreadsheets, regulated environments"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: appendix d5 records and data integrity\n\nspreadsheets do not support audit trails. there may be add-on tools available to do so, but they are not commonly used. if a regulated company chooses to depend on an add-on tool, a thorough analysis of the capabilities and limitations of the add-on should be performed, as many of the controls described below may be required.\n\n### 17.2.1 spreadsheets that are simple documents\n\nthe easiest class of spreadsheets to manage are static tables. these should be controlled in the same way as word processing documents and their control can be managed within an electronic document management system (edms). 
edmss are designed to apply controls to documents that ensure that storage, access, and versioning support compliance and legal requirements.\n\nwhere an edms is not available, the primary challenge is control of the storage of documents. control issues are similar to those for other electronic files. saving the final version as a pdf can help to ensure a document is more difficult to edit. digital signature tools can be used as needed.\n\n### 17.2.2 spreadsheets that are templates\n\na frequent role for spreadsheets is the repetitive use of calculation algorithms. any integrity issues related to the template can spread to every record that is generated based upon that template; therefore, the integrity issues can have significant potential impact.\n\nit should be ensured that algorithms in templates are appropriate; however, it is not necessary to verify that a spreadsheet performs arithmetic accurately. templates should be independently verified and approved, prior to providing them for use. templates should be stored in a manner restricting the ability to alter them to a very small number of people. 
typically, this will be a combination of measures such as:\n\n- storage of the template in a directory that appropriately restricts write, edit, and delete access\n- users should only be able to copy the template to a separate, protected directory with limited access\n- password protection of all cells in the template, except for those where data is entered so that the algorithms cannot be edited during use\n- restriction of the ability to edit documents created using the template\n- traceability to the creator/editor of records created based on a template\n- where feasible, once records created from a template are finalized they should be stored in an immutable format, e.g., pdf\n- spreadsheets based on templates that support gxp decisions should be saved in a secure manner that prevents unauthorized and/or undetectable changes to the recorded data throughout the required retention period\n\n### 17.2.3 single use spreadsheets\n\nspreadsheets may be used to analyze a unique problem, such as investigating an out of specification result or evaluating a manufacturing trend.\n\nsingle use spreadsheets should be managed in a similar way to spreadsheets that are simple documents. the main difference is the integrity of the calculations. calculations should be verified as appropriate, but checking arithmetic is not considered necessary. 
capturing a view of the calculations can provide a long-term record that the calculations are correct.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "77c05a51-f23d-46e8-baca-d72546adb0b9": {"__data__": {"id_": "77c05a51-f23d-46e8-baca-d72546adb0b9", "embedding": null, "metadata": {"page_label": "121", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Best Practices and Considerations for Data Integrity and Database Management", "questions_this_excerpt_can_answer": "1. What specific measures are recommended for ensuring data integrity when using spreadsheet programs for data that requires additional entries over time?\n \n2. Why are spreadsheets not recommended for use as databases from a data integrity perspective, and what alternative is suggested for better control of data integrity?\n\n3. What are the key considerations and management practices recommended for ensuring data integrity in user-developed and centrally managed PC databases within regulated companies?", "prev_section_summary": "The section discusses the management of spreadsheet integrity in regulated environments according to the ISPE Records and Data Integrity Guide. Key topics include controlling static spreadsheets, managing spreadsheet templates, and handling single-use spreadsheets. Entities mentioned include electronic document management systems (EDMS), control measures for templates, verification of algorithms, and ensuring the integrity of calculations in single-use spreadsheets. 
The importance of document control, restricted access, password protection, and traceability to creators/editors is emphasized to maintain data integrity in spreadsheets.", "excerpt_keywords": "data integrity, spreadsheet programs, database management, user-developed tools, centrally managed databases"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## records and data integrity\n\navailable spreadsheet programs may include tools that allow cell contents to be displayed and subsequently printed to paper or saved to pdf. if a spreadsheet of this type needs to be left open for additional data entry, it should be set up so that revision of the spreadsheet will require versioning. one mechanism would be to disallow saving over the existing version. administrative procedures defining protections are recommended. depending on the risks related to the record, the data and calculations may need to be independently reviewed by a second individual.\n\n## spreadsheets as databases\n\nspreadsheets should not be used as databases from a data integrity viewpoint. in addition, audit trails for the individual cell contents are not available, so that it is not possible to recreate changes without examining every single version. control of change tracking is generally not sufficient to meet gxp regulatory expectations, even if it can be enabled. where a desktop database is required, acceptable control of data integrity may be more achievable by using a real database engine. platforms other than a desktop or handheld device are recommended for controlling data integrity. 
note: just accessing a database using such a device is a more easily manageable risk.\n\n## data integrity for pc databases\n\n### user developed and managed tools\n\naspects of a purely local database that require protection are the same as those required for a server-based database, but they can be more difficult to administer. for example, segregation of duties is not possible if the user of the database is also the developer and owner. for this reason, the routine use of such tools for gxp processes is not recommended. when a user-developed and managed tool is developed for a specific problem, e.g., supporting an investigation, single-use applications would be expected. short-term it projects may not need to be executed under a formal sdlc with built-in data integrity protections. data integrity concerns should nonetheless be considered. once the investigation is completed, the database should be locked and securely stored.\n\n### centrally managed pc databases\n\nwhere a regulated company decides to use a pc database, an it group should manage the tool in the same fashion as a server-based database. this effectively makes the tool a server-based database on a different database engine and operating system. segregation of duties, access controls, backup, and archiving can be managed appropriately. however, some pc-based database engines may not be able to manage issues such as audit trails and role-based security in a way that would be expected by regulators. 
choosing to use a tool such as a pc database should be based on a formal risk assessment.\n\nthe main reason for this is that every time data is added to the database, a completely new version of the database is effectively created.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "4bf7335c-c91e-41b8-8acf-37d2432584e8": {"__data__": {"id_": "4bf7335c-c91e-41b8-8acf-37d2432584e8", "embedding": null, "metadata": {"page_label": "122", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Ensuring Data Integrity in User Developed Statistical Tools: Best Practices and Strategies", "questions_this_excerpt_can_answer": "1. What specific data integrity practices are recommended for user-developed statistical tools used in pharmaceutical data analysis, according to the ISPE GAMP\u00ae Guide's Appendix D5?\n \n2. How does the ISPE Records and Data Integrity Guide suggest controlling access and modifications to templates used in statistical analysis of pharmaceutical products, such as tablet weight distribution?\n\n3. What measures does the ISPE Records and Data Integrity Guide recommend for ensuring the traceability and integrity of results generated from user-developed statistical tools in the pharmaceutical industry?", "prev_section_summary": "The section discusses the importance of data integrity in records and databases, specifically focusing on the use of spreadsheet programs and databases in regulated companies. 
Key topics include measures for ensuring data integrity in spreadsheet programs, the limitations of using spreadsheets as databases, and best practices for data integrity in user-developed and centrally managed PC databases. Entities mentioned include the need for versioning in spreadsheets, the recommendation to use real database engines for better control of data integrity, and the considerations for managing user-developed and centrally managed PC databases in regulated companies.", "excerpt_keywords": "ISPE, Records, Data Integrity, Statistical Tools, Pharmaceutical Industry"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: appendix d5 records and data integrity\n\n### 17.4 data integrity for statistical tools\n\nuser developed statistical tools may be used in the same way as pc databases and have the same data integrity concerns. single-use tools supporting investigations should be locked and controlled following completion of the investigation.\n\nuser developed statistical tools may be used repetitively, e.g., to analyze tablet weight distribution for a batch of finished product. template controls should be similar to those for spreadsheet templates:\n\n- the template should be stored in a controlled location, with limited access\n- authorized users should only be able to copy the template to a different directory where it can be used. 
the code should be inaccessible to users in this location; users should only be able to add and process data.\n- once the result of the analysis has been obtained, it should be protected against unauthorized change\n- the result of the analysis should be traceable to the user who generated it\n- if there are tools to remove or hide statistical outliers from the data set, it may be appropriate to control how and by whom that functionality can be used.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "88d6f346-ea94-47bb-bc80-1f5c0cab012c": {"__data__": {"id_": "88d6f346-ea94-47bb-bc80-1f5c0cab012c", "embedding": null, "metadata": {"page_label": "123", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Best Practices for Managing Records for GXP Compliance: Retention, Archiving, and Migration", "questions_this_excerpt_can_answer": "1. What are the key considerations for a regulated company when managing electronic records to comply with GXP regulations, specifically in terms of record retention and data integrity?\n \n2. How does the ISPE GAMP guide differentiate between \"record retention\" and \"archiving\" in the context of managing electronic records for GXP compliance, and what are the advantages of near-line solutions over off-line solutions for electronic record archiving?\n\n3. 
According to the ISPE Records and Data Integrity Guide, what measures should be taken to ensure the protection of on-line electronic records in a production database, especially in the context of complying with 21 CFR Part 58.190 for the archiving of preclinical study results?", "prev_section_summary": "The section discusses data integrity practices for user-developed statistical tools in pharmaceutical data analysis, as outlined in the ISPE GAMP\u00ae Guide's Appendix D5. It emphasizes the importance of controlling access and modifications to templates used in statistical analysis, such as tablet weight distribution. The guide recommends storing templates in controlled locations with limited access, protecting analysis results against unauthorized changes, and ensuring traceability of results to the user who generated them. Additionally, it suggests controlling the use of tools to remove or hide statistical outliers from the data set.", "excerpt_keywords": "ISPE, GXP compliance, electronic records, data integrity, archiving"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: page 121\n\n## records and data integrity appendix o1\n\n## appendix o1 - retention, archiving, and migration\n\n### 18.1 introduction\n\nthis appendix describes approaches to managing records in order to comply with gxp regulations in regard to record retention and data integrity. the focus is on the choices a regulated company may want to make, including issues related to migrating electronic records to non-processable formats.\n\nnote: this appendix does not discuss the definition of the retention period for various types of electronic records, which is based on the relevant gxp regulations and company policy. 
in addition, it is not intended to be a complete guide to gxp compliant data migration or archiving practices.\n\n### 18.2 retention options\n\nthe approach to data retention should be based on regulatory and legal requirements, as well as an assessment of the risk associated with the data format, physical media, and anticipated future use of the data. data management activities (including security and disaster recovery) should also be considered.\n\nthe terms "record retention" and "archiving" describe separate issues. typically, archiving involves removing an electronic record from the system that produced it, e.g., a production database. archiving is also an approach to meeting electronic record retention requirements. the selected approach to electronic record retention should meet relevant gxp regulations:\n\n- near-line solutions: these can archive electronic records invisibly to users. for example, older electronic records may be moved to another database but remain accessible through the main application. near-line solutions have the advantage of rapid access.\n- off-line solutions: these involve archiving electronic records on different media (e.g., optical disk or magnetic tape). typically, this will involve more effort to retrieve archived electronic records. off-line solutions usually trade rapid access for less costly storage.\n\nuse of non-electronic media (such as microfilm, microfiche, or paper) or a standard electronic file format (such as pdf, sgml, or xml) should also meet relevant gxp regulations. electronic record content and meaning should be preserved. metadata should be considered and may be a critical component of the content and meaning of an electronic record.\n\n### 18.3 protection of records\n\nfor on-line electronic records, logical and physical security measures, including backup, should be applied. system upgrades may require data migration. 
data migration plans should ensure the integrity of the electronic records in the database, as well as electronic records that have been archived.\n\n21 cfr part 58.190 [32] requires that the results of preclinical studies be archived (and under the control of an archivist) at the completion of the study. in such cases, if the records are to be retained on-line in a production database, measures need to be taken to protect them from alteration in order to comply with this predicate rule.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "86be8a96-629f-4030-bb1f-59e9f261d6d6": {"__data__": {"id_": "86be8a96-629f-4030-bb1f-59e9f261d6d6", "embedding": null, "metadata": {"page_label": "124", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Managing Electronic Records in the ISPE GAMP(r) Guide: Considerations for Archived Records, Record Aging, and Archival Solutions", "questions_this_excerpt_can_answer": "1. What are the recommended practices for managing the longevity and integrity of archived electronic records according to the ISPE GAMP(r) Guide, specifically in terms of media exercise, refresh, and storage conditions?\n\n2. How does the ISPE GAMP(r) Guide suggest handling the technical refresh or conversion of electronic records to ensure compatibility with upgraded production systems, and what considerations should be taken into account during this process?\n\n3. 
What factors should be considered according to the ISPE GAMP(r) Guide when deciding to archive electronic records, especially in terms of record aging, risk assessment for data migration, and the decision-making process for converting electronic records to a less processable format?", "prev_section_summary": "The section discusses key considerations for managing electronic records to comply with GXP regulations, focusing on record retention, archiving, and migration. It differentiates between record retention and archiving, highlighting the advantages of near-line solutions over off-line solutions for electronic record archiving. The importance of protecting on-line electronic records in a production database, especially in compliance with 21 CFR Part 58.190 for archiving preclinical study results, is also emphasized. Key entities include regulatory and legal requirements, data format, physical media, data management activities, near-line and off-line solutions for archiving, non-electronic media, metadata, logical and physical security measures, system upgrades, and data migration plans.", "excerpt_keywords": "ISPE GAMP(r) Guide, electronic records, record aging, archival solutions, data migration"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: appendix o1\n\nfor archived electronic records, additional considerations include:\n\n- exercise of the media\n- refresh of the media\n- storage conditions\n\nmedia should be used in accordance with its specifications. when refreshing the media, the typical approach is to use new media of the same type, although this is not required. any change in the media should be evaluated for potential risk. magnetic media is prone to degradation over time, so the lifetime of magnetic media can vary. 
alternative media (e.g., optical) can provide increased durability.\n\narchived electronic records may need to be technically refreshed and/or converted to a new format that is compatible with an upgraded production system, e.g., to support new versions of layered software such as a database engine upgrade.\n\nconsiderations for technically refreshing electronic records include:\n\n- validation activities\n- floating point issues\n- rounding versus truncating\n\nwhen systems are retired, electronic records may still be within their retention period. rendering software may be developed to provide access to these electronic records as an alternative to retaining a costly software license. limitations of the rendering software in regard to processing of the records should be understood. a formal risk assessment associated with the system retirement and rendering software development should be undertaken.\n\n### record aging and risk\n\nfor some records, the associated risks are not constant throughout the record life cycle. this can impact the measures and controls required to safely, effectively, and economically manage the records. for example, data migration planning should include an evaluation of the risks to the records.\n\ndata migration may be required several times for electronic records with a long retention period. data migration can be difficult and expensive. if the risk related to the electronic records is low, migration of the records to a medium with a longer lifespan (e.g., paper, pdf, flat file) may be appropriate. for global information systems, the risk assessment should consider risks related to all sites and jurisdictions with an interest in the data.\n\nwhenever a decision is made to convert electronic records to a less processable format (e.g., from dynamic to static), a risk assessment should be performed and documented. the risk assessment should consider applicable risks for all jurisdictions and for prospective uses of the information. 
for example, clinical data relating to a mature product that is being phased out of production might be a candidate for conversion to another format. however, if the product is planned for introduction to a new country, or is being considered for a new therapeutic indication, it may be more appropriate to migrate the electronic records and keep them processable.\n\n### archival\n\narchival is a solution for the retention of records that are no longer actively used but need to be retained for business, legal, or regulatory reasons. not all records will be archived, e.g., those with short retention periods.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "608941a5-268b-46bf-81d6-d18c813af6e7": {"__data__": {"id_": "608941a5-268b-46bf-81d6-d18c813af6e7", "embedding": null, "metadata": {"page_label": "125", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Best Practices for Managing Electronic Records and Ensuring Data Integrity in Regulated Environments", "questions_this_excerpt_can_answer": "1. What are the primary reasons for archiving records as outlined in the ISPE GAMP\u00ae Guide, and how does it differentiate between the purposes of archiving and backup of electronic records?\n \n2. According to the ISPE Records and Data Integrity Guide, what risks are associated with not removing electronic records from an active database after they have been archived, and how does this impact the integrity of the archive?\n\n3. 
In the context of regulated environments, how does the ISPE guide suggest handling hybrid situations where electronic records may not need to be retained in their original processable format throughout the entire retention period, and what are the risk/benefit considerations for data conversion as depicted in figure 18.1?", "prev_section_summary": "The section discusses the management of archived electronic records according to the ISPE GAMP(r) Guide, focusing on considerations such as media exercise, refresh, and storage conditions. It also addresses the technical refresh or conversion of electronic records to ensure compatibility with upgraded production systems, as well as factors to consider when deciding to archive electronic records, including record aging, risk assessment for data migration, and the decision-making process for converting records to a less processable format. Key topics include media specifications, technical refresh considerations, record aging and risk assessment, data migration planning, and archival solutions for retaining records for business, legal, or regulatory purposes. Key entities mentioned include media types, validation activities, rendering software, data migration, and archival solutions.", "excerpt_keywords": "ISPE, GAMP Guide, electronic records, data integrity, archiving"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: page 123\n\n### records and data integrity appendix o1\n\nreasons for archiving records include:\n\n- freeing storage space (this can apply to disk space for electronic records (or to file cabinets for paper records))\n- protection of the records\n- computer system performance, which may degrade when the database becomes too large\n\narchiving electronic records involves moving the records into an archive and then deleting the records from the active database. 
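for file-based records, the move-then-delete step just described can be sketched as a copy that is checksum-verified before the original is deleted. this is an illustrative sketch only (the paths and helper names are invented), not a complete archiving solution:

```python
import hashlib
import shutil
import tempfile
from pathlib import Path

def sha256(path: Path) -> str:
    """Return the SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def archive_record(active: Path, archive_dir: Path) -> Path:
    """Copy a record into the archive, verify the copy, then delete the original."""
    archive_dir.mkdir(parents=True, exist_ok=True)
    target = archive_dir / active.name
    before = sha256(active)
    shutil.copy2(active, target)           # copy2 also preserves timestamps
    if sha256(target) != before:           # verify before destroying the original
        target.unlink()
        raise IOError(f"archive copy of {active} failed verification")
    active.unlink()                        # only now remove it from the active store
    return target

# usage sketch with a throwaway record
work = Path(tempfile.mkdtemp())
record = work / "record_001.xml"
record.write_bytes(b"<result>10.5</result>")
archived = archive_record(record, work / "archive")
print(archived.name, record.exists())  # record_001.xml False
```

verifying the copy before deleting the original is what prevents the archive and the (now removed) active record from ever diverging silently; a real implementation would also carry the record's metadata and log the operation.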
the archive is intended as a long-term storage solution (years).\n\narchived electronic records should be removed from the active database; otherwise there is a risk that a change is made to the record in the active database after it has been archived. this can cause differences between the record in the active database and the record in the archive, raising questions about the integrity of the archive.\n\n### 18.5.1 backup\n\nbackup of electronic records involves generating a copy of the records. the purpose of a backup is to provide a copy that allows restoration of the database in case of loss or corruption. the retention of backups should be relatively short term (e.g., weeks or months).\n\nbackups should not be used in place of archives. it is inefficient to manage years of backups, and it can be difficult to recover individual records should the need arise.\n\n### 18.6 hybrid situations and archives\n\nregulated companies may choose to retain electronic records in formats other than the original record format. the content and meaning of the original electronic record should be preserved and gxp regulations should be met.\n\nit may not be considered necessary to retain electronic records in a processable format throughout the entire retention period. the likelihood that a record may need to be reprocessed may decrease significantly as that record ages. a point may be reached where a decision should be made on the risk/benefits of maintaining an electronic record in a processable format. 
see figure 18.1.\n\nfigure 18.1: risk/benefit considerations for data conversion. the options considered are:\n\n- migration of records to a new format\n- retention of records in old format with some form of viewer allowing limited manipulation, e.g., sorting or trending\n- retention of records in old format with a viewer with no additional functionality\n- conversion of records to a standard electronic format with long-term viability (e.g., pdf)", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "16353cfb-da8b-4e0f-a54f-ec8d73ff5fd6": {"__data__": {"id_": "16353cfb-da8b-4e0f-a54f-ec8d73ff5fd6", "embedding": null, "metadata": {"page_label": "126", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Best Practices for Retaining Electronic Records and Audit Trails in Regulated Companies", "questions_this_excerpt_can_answer": "1. What factors should regulated companies consider when deciding whether to retain electronic records in a processable form, according to the ISPE GAMP\u00ae Guide's Appendix O1?\n \n2. How does the ISPE Records and Data Integrity Guide suggest handling the migration or transformation of electronic records to ensure compliance with GxP regulations and maintain data integrity, especially concerning audit trails and metadata?\n\n3. 
What specific criteria does the ISPE Records and Data Integrity Guide recommend for deciding whether to delete the original version of an electronic record after its content and meaning have been preserved and archived?", "prev_section_summary": "The section discusses the reasons for archiving records according to the ISPE GAMP\u00ae Guide, highlighting the need for freeing storage space, protecting records, and maintaining computer system performance. It emphasizes the importance of archiving electronic records by moving them into a long-term storage solution and deleting them from the active database to ensure data integrity. The risks associated with not removing archived records from the active database are also mentioned, as well as the distinction between archiving and backup. Additionally, the section addresses hybrid situations where electronic records may not need to be retained in their original format throughout the entire retention period, providing risk/benefit considerations for data conversion.", "excerpt_keywords": "ISPE, GAMP Guide, electronic records, audit trails, data integrity"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: appendix o1\n\nregulated companies should evaluate whether retaining electronic records in a processable form is worth the expense of doing so. if the result of the evaluation is a decision to compromise or remove the ability to reprocess, a documented risk assessment supporting the decision should be performed. the risk assessment should include the reasoning behind the choice of which metadata needs to be migrated to the new format.\n\nthe primary consideration should be the effect of a change in electronic record format on risk to patient safety. 
other considerations may include:\n\n- the ability to demonstrate record integrity in the new format\n- the likelihood that changes to the record would be necessary after conversion\n- future use of the record, including potential need to:\n  - sort\n  - trend\n  - otherwise manipulate the record\n- the difficulty and expense of performing any of these manipulations, if necessary\n- availability of the electronic record to regulators\n- regulated company risk tolerance related to a potential regulatory request\n\nlower cost and simpler technical options, e.g., paper, may be adequate solutions, depending on the impact on the record (e.g., the criticality of metadata for understanding the record) and the requirements for processing. where an assessment has determined that the audit trail needs to be retained, any metadata needed to support electronic record integrity should remain part of the electronic record. metadata may include:\n\n- date of edit of the electronic record\n- identity of the editor\n- previous values\n\nthe original version of an electronic record may be deleted if the content and meaning of the original record are preserved and archived, and gxp regulations are met. where migration or transformation is not required to keep the electronic record in a processable format, archival of the electronic record may be the lowest risk approach.\n\n### 18.7 audit trail considerations\n\naudit trails should be considered part of records. data migration activities should retain audit trail information. a record of the changes to an electronic record prior to the conversion of its format should be preserved, if possible. this should occur even if the electronic record is converted to pdf or paper.\n\na decision not to migrate an audit trail should be justified, based on risk, and should be documented. 
for example:\n\n- if the audit trail is integral to understanding or ensuring the integrity of the record, it should be part of the migrated record, e.g., changing a glp or gmp laboratory test result based on reintegration of a chromatogram.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "d01520cc-d5c5-40b8-bddb-9ce0742bd721": {"__data__": {"id_": "d01520cc-d5c5-40b8-bddb-9ce0742bd721", "embedding": null, "metadata": {"page_label": "127", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Managing Electronic Records in Alternative Systems for Data Integrity and Compliance: Best Practices and Strategies", "questions_this_excerpt_can_answer": "1. What are the considerations for managing electronic records in alternative systems with respect to data integrity and compliance, as outlined by the ISPE GAMP(r) guide?\n \n2. How does the ISPE Records and Data Integrity Guide address the preservation of content and meaning when migrating electronic records to higher-level systems, and what are the recommended practices for ensuring data integrity during this process?\n\n3. 
According to the ISPE Records and Data Integrity Guide, what factors should regulated companies evaluate when considering the transfer of electronic records to an alternative system for the purpose of reprocessing data or exporting data back to the originating system?", "prev_section_summary": "The section discusses best practices for retaining electronic records and audit trails in regulated companies, as outlined in the ISPE Records and Data Integrity Guide. Key topics include factors to consider when deciding whether to retain electronic records in a processable form, handling the migration or transformation of electronic records to ensure compliance with regulations and maintain data integrity, and criteria for deciding whether to delete the original version of an electronic record after preservation and archiving. Entities mentioned include regulated companies, electronic records, metadata, audit trails, risk assessment, patient safety, record integrity, regulators, and GxP regulations.", "excerpt_keywords": "ISPE, GAMP, electronic records, data integrity, compliance"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: page 125\n\n## records and data integrity appendix o1\n\n- if the audit trail is not required by gxp regulation and the data was used only for unregulated purposes, e.g., statistical process optimization within validated parameters, or for workflow tracking, it has low gxp impact and migration of the audit trail may not be necessary.\n\n## alternative systems\n\nelectronic records may be collected and managed on different systems. 
for example, it may be appropriate to utilize superior data management capabilities (e.g., audit trailing, consolidated backup) in a higher-level system, rather than trying to build those capabilities into several laboratory instruments (i.e., the originating systems).\n\nif uses and manipulations of an electronic record are intended to be in the higher-level system:\n\n- the content and meaning of the electronic record should be preserved\n- manipulation of the original raw data file through the originating system should be prevented\n\nsystems may allow the original data file from the laboratory instrument to be retained and managed in the higher-level system. this can be advantageous because it can allow:\n\n- greater control of the data offered in the higher-level system\n- restoring of the data to the originating system for reprocessing while retaining the original data file\n\nsystems may use proprietary data formats. the content and meaning of the electronic record may not be preserved when these formats are converted to new formats. in some cases, it may be possible to manage data files through the higher-level system; however, the electronic records may be only viewable in the originating system. an assessment should be performed to determine whether processability is critical. the need for this may decrease as the record ages.\n\nthe content and meaning of migrated records should be preserved. this typically involves either validating the conversion or verifying the accuracy of the new version.\n\nregulated companies should understand the risks, as well as benefits, of a solution that places records in an alternative system. eu gmp chapter 6, section 6.9 [33] states:\n\n\"some kinds of data (e.g. test results, yields, environmental controls) should be recorded in a manner permitting trend evaluation.\"\n\nif a regulated company interprets this as requiring the ability to reprocess the data, transferal to a higher-level system may not be appropriate. 
if transferal to a higher-level system is considered, that system should be able to support reprocessing data or exporting data back to the originating system.\n\nregulated companies should consider potential future needs for manipulating records, when evaluating the risk of performing anticipated analyses without the ability to reprocess data. the risks and costs associated with validating or verifying data migration should also be considered.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "0a0e8534-984b-4b56-9b9b-304eb4404cb7": {"__data__": {"id_": "0a0e8534-984b-4b56-9b9b-304eb4404cb7", "embedding": null, "metadata": {"page_label": "128", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Converting Electronic Records to Alternative Formats and Media Hybrids: Considerations and Best Practices", "questions_this_excerpt_can_answer": "1. What are the primary considerations a regulated company should take into account when deciding to convert electronic records to alternative formats or media hybrids, especially in the context of drug product distribution records?\n \n2. What risks are associated with moving electronic records from one repository to another for archiving purposes, and what method is suggested to ensure the integrity of the electronic record during migration?\n\n3. 
How does the ISPE GAMP\u00ae Guide: Appendix O1 view the conversion of electronic records to PDF format in terms of data integrity, product quality, and patient safety, and what are the perceived advantages and disadvantages of using PDF as an alternative format for record-keeping?", "prev_section_summary": "The section discusses managing electronic records in alternative systems for data integrity and compliance, as outlined by the ISPE GAMP(r) guide. Key topics include considerations for utilizing higher-level systems for data management, preserving content and meaning during record migration, preventing manipulation of raw data files, and assessing the risks and benefits of transferring records to alternative systems. Entities mentioned include regulated companies, electronic records, proprietary data formats, and the need for validating or verifying data conversion.", "excerpt_keywords": "ISPE, Records, Data Integrity, Electronic Records, PDF"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: appendix o1\n\nrecords and data integrity\n\n### 18.9 converting electronic records to alternative format or alternative media hybrids\n\n### 18.9.1 considerations for conversion\n\nthe primary driver for any decision to convert electronic records to other formats should be business need. appropriate points for considering conversion include:\n\n- creation of the record\n- the point at which a record is to be archived\n- at system upgrade, especially if electronic record conversion is needed\n- at system replacement when contemplating data migration to the new system\n- at system retirement, especially if electronic record conversion or development of rendering software is needed\n- the point at which a media refresh is needed\n\nfor example, for drug product distribution records, the speed of response is critical for dealing with recall situations. a regulated company may decide that distribution records are not well suited for immediate conversion to a non-processable format, as the ability to search and access the distribution records quickly is usually suited to a computerized system. the risk would be substantially lower, however, after the expiration of a lot of drug product, so conversion to paper at that point might be justified.\n\n### 18.9.2 changing repositories without altering format\n\nthere are risks associated with moving electronic records from one repository to another, e.g., for archiving. risks may include:\n\n- media degradation\n- accidental loss\n- failure to retain software capable of viewing the records\n\nmethods such as checksum verification help to ensure that migration of an electronic record is complete.\n\n### 18.9.3 risk assessment for conversion\n\ndecisions to convert electronic records to an alternative medium, format, or repository should be justified. the risk assessment should demonstrate that there is no unacceptable risk to data integrity, product quality, and patient safety.\n\nwhile pdf is an electronic format and does offer some ability to manage records using audit trails and digital signatures, it is considered an alternative format because conversion to pdf generally sacrifices the ability to process the data. 
however, pdf carries the advantage of being able to execute searches within documents (if not scanned), and, depending on how the files are stored, may also offer searchability on the documents themselves. this should be considered when selecting the format to which records will be converted.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "2b3f3311-511b-4e6d-8a0b-0b94e0d80812": {"__data__": {"id_": "2b3f3311-511b-4e6d-8a0b-0b94e0d80812", "embedding": null, "metadata": {"page_label": "129", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Risk Assessment and Data Integrity in Electronic Records Management: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. How does the risk assessment process differ between small to moderate sized systems and larger, more complex systems in the context of records and data integrity, according to the ISPE Records and Data Integrity Guide?\n\n2. What specific considerations and potential effects should be evaluated when converting regulated records to another format, especially for higher impact records, as outlined in the ISPE Records and Data Integrity Guide?\n\n3. 
In the scenario where accuracy and completeness of records in a drug safety database could be compromised by conversion, what are the potential risks to patient safety mentioned in the ISPE Records and Data Integrity Guide, and how do these risks compare to the impact on records in a training database?", "prev_section_summary": "The section discusses the considerations and best practices for converting electronic records to alternative formats or media hybrids, particularly in the context of drug product distribution records. Key topics include the primary considerations for conversion, risks associated with moving electronic records between repositories, methods to ensure data integrity during migration, and the risk assessment for conversion. The section also explores the ISPE GAMP\u00ae Guide's view on converting electronic records to PDF format, highlighting the advantages and disadvantages of using PDF as an alternative format for record-keeping. Key entities mentioned include regulated companies, electronic records, alternative formats, media hybrids, PDF format, data integrity, product quality, patient safety, and risk assessment.", "excerpt_keywords": "ISPE, Records, Data Integrity, Risk Assessment, Electronic Records Management"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## records and data integrity\n\nwhere appropriate, the risk assessment process should be based on groups of related records:\n\n- for a small to moderate sized system (e.g., a chromatography data system), it may be possible to evaluate all of the records as a single group.\n- for larger, more complex systems (e.g., an erp), several groups of records should be evaluated independently.\n\nwhen considering conversion of regulated records to another format, risks and requirements should be considered for higher impact records. 
see table 18.1.\n\nfor low impact records the approach to archiving should follow good it practices. for higher impact records, risks and requirements such as those in table 18.2 should be evaluated.\n\nrisk assessments should consider the way in which electronic records are accessed and used. regulated companies should consider potential effects (see tables 18.1 and 18.2) in the context of each unique set of electronic records.\n\nfor example, if accuracy and completeness of records in a drug safety database could be compromised by conversion, and the converted record could then be interpreted incorrectly, there could be significant risk to patient safety, based on erroneous medical conclusions. the same occurrence to records in a training database would have a less severe impact.\n\n|risk factors and requirements|considerations|potential effects|\n|---|---|---|\n|conversion may change the accuracy and completeness of the record in a manner that would affect the interpretation of the data|if the converted record is considered the original data, the possibility of changing the interpretation of the data would be unacceptable|interpretation of the converted record leads to a different conclusion than before conversion|\n|users may have to execute a rapid search of the data across records|if rapid retrieval is necessary, e.g., to support a product recall, conversion may be inappropriate, as cross record searching is far easier using database technology|unable to rapidly search, inadequate/incomplete searches|\n|users may have to execute large or frequent searches on the records|frequent or large searches introduce increased probability that the searches will be incomplete|spend inordinate resources on searches, inadequate/incomplete searches, unable to execute effective search|\n|users may have to search the records based on a wide range of keys|most filing systems for non-electronic records have limited searchable keys|spend inordinate resources on searches, 
inadequate/incomplete searches, unable to execute effective search|\n|retention of original record after conversion to an alternative format|why retain the original? how will it be kept consistent with the converted copy?|inconsistency of records, confusion/inaccuracy|\n|record may have to be modified after it is converted to an alternative format|changes may be harder to execute and to track in the alternative format|audit trail inadequate, external audit trail may be required|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "e9a78a30-404a-439e-9818-5eec87992c68": {"__data__": {"id_": "e9a78a30-404a-439e-9818-5eec87992c68", "embedding": null, "metadata": {"page_label": "130", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Electronic Record Accessibility and Integrity Guidelines", "questions_this_excerpt_can_answer": "1. What are the specific considerations that need to be addressed when ensuring that employees have ready access to electronic records in a new format, especially in terms of geographic or technical challenges?\n\n2. How does the retention of an audit trail as part of an electronic record in an alternative format impact the integrity and size of the record archive, according to the guidelines provided in the ISPE Records and Data Integrity Guide?\n\n3. 
What are the potential risks and effects associated with the preservation of electronic signatures when converting electronic records into an alternative format, as outlined in the Electronic Record Accessibility and Integrity Guidelines?", "prev_section_summary": "The section discusses the importance of risk assessment in electronic records management for ensuring data integrity. It highlights the differences in the risk assessment process for small to moderate sized systems compared to larger, more complex systems. The excerpt emphasizes the considerations and potential effects of converting regulated records to another format, especially for higher impact records. It also addresses the risks to patient safety that may arise from compromised accuracy and completeness of records in a drug safety database. Additionally, it outlines various risk factors and requirements to be evaluated during the risk assessment process, such as changes in data interpretation, search capabilities, retention of original records, and modifications after conversion. The section underscores the need for thorough risk assessments to mitigate potential risks and ensure the integrity of electronic records.", "excerpt_keywords": "Electronic Record, Data Integrity, Guidelines, Audit Trail, Risk Assessment"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n|risk factors and requirements|considerations|potential effects|\n|---|---|---|\n|employees who need it do not have ready access to the electronic record in the new format in order to perform their job responsibilities|if they are expected to use the alternative format electronic record, it needs to be accessible. 
this can be problematic due to geographic or technical factors (e.g., no access to a required reader).|- inefficiency\n- actions taken based on insufficient data\n|\n|an audit trail needs to be retained as part of the electronic record in the alternative format|- is the electronic record history retained in the audit trail critical to the value of the record?\n- is the audit trail integral to electronic record integrity?\n- is an audit trail required by a gxp regulation?\n- an audit trail in an alternative format may double (or worse) the size of each record. (this may in fact be a driver for moving records from the database to archive.)\n|- audit trail inadequately shows subsequent changes, with potential reduction in record integrity\n- size of archive may become unwieldy if audit trail retention is handled ineffectively\n- large database size may lead to performance problems\n|\n|an electronic signature is associated with the electronic record|- is evidence of the approval still in the new version?\n- is the alternative format adequate evidence of authenticity?\n- is the link between electronic signature and electronic record preserved?\n|- evidence of timely approval is compromised or lost\n- hybrid manifestation of electronic signature loses legal meaning/weight\n- linkage of electronic record with signature is broken\n|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "a64e07fe-146e-4017-9281-f28ff291ece8": {"__data__": {"id_": "a64e07fe-146e-4017-9281-f28ff291ece8", "embedding": null, "metadata": {"page_label": "131", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, 
"creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Maintaining Data Integrity in Records Conversion to Alternative Media: Key Considerations and Best Practices", "questions_this_excerpt_can_answer": "1. What are the specific challenges and considerations associated with maintaining data integrity when converting records to alternative media, particularly in the context of rapid data retrieval requirements such as during a product recall scenario?\n\n2. How does the retention of original electronic records after their conversion to alternative media impact the consistency of records, and what are the key considerations for ensuring consistency between the original and converted copies?\n\n3. What are the implications of retaining an audit trail as part of the record when converting to alternative media, especially in terms of regulatory compliance with GxP regulations, and how does this affect the size and manageability of the archive?", "prev_section_summary": "The section discusses the considerations and potential effects related to ensuring employees have access to electronic records in a new format, retaining audit trails in alternative formats, and preserving electronic signatures during record conversions. Key topics include access challenges, audit trail retention impact on record integrity and archive size, and risks associated with electronic signature preservation. 
Key entities mentioned are employees, audit trails, electronic signatures, and electronic records.", "excerpt_keywords": "Records, Data Integrity, Conversion, Alternative Media, Audit Trail"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## records and data integrity\n\n|risk factors and requirements|considerations|potential effects|\n|---|---|---|\n|users may have to execute a rapid search of the data across records|if rapid retrieval is necessary, e.g., to support a product recall, search capabilities on the new media may be limited and restoration of the records to a searchable platform may cause delay|unable to rapidly search. this may be partially mitigated by development of emergency procedures to eliminate delays that are purely administrative in nature.|\n|users may have to execute large, complex, or frequent searches|search capabilities on the new media may be limited. frequent restoration of archived data would be resource intensive and expensive.|spend inordinate resources on searches|\n|retention of original electronic record after conversion to an alternative media|* why retain the original? * how will it be kept consistent with the converted copy?|inconsistency of records|\n|record may have to be modified after it is committed to alternative media|changes may be harder to execute and to track on the new media|* required changes not executed or not executed in a timely fashion * audit trail inadequate|\n|an audit trail needs to be retained as part of the record|* is the electronic record history retained in the audit trail critical to the value of the record? * is the audit trail integral to electronic record integrity? * is an audit trail required by gxp regulation? 
* depending on architecture of the audit trail, changes after commitment to different media may multiply electronic record size several-fold.|size of archive may become unwieldy if audit trail retention is handled ineffectively|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "45067ff5-7d8c-461e-a6c5-df6ec7bb1f8d": {"__data__": {"id_": "45067ff5-7d8c-461e-a6c5-df6ec7bb1f8d", "embedding": null, "metadata": {"page_label": "132", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Exploring the Absence of Content: A Study on Blank Canvases\"", "questions_this_excerpt_can_answer": "Based on the provided context, here are three questions that this specific context can provide specific answers to, which are unlikely to be found elsewhere:\n\n1. **What is the significance of the document titled \"Exploring the Absence of Content: A Study on Blank Canvases\" within the ISPE Records and Data Integrity Guide?**\n - This question seeks to understand the relevance or contribution of a document with a seemingly unrelated title to the broader topic of records and data integrity in the pharmaceutical industry, as guided by ISPE (International Society for Pharmaceutical Engineering).\n\n2. 
**How does the document \"Exploring the Absence of Content: A Study on Blank Canvases\" fit into the training or educational materials provided by the PharmaWise Engineer program, specifically within the context of CSV (Computer System Validation) & Data Integrity?**\n - This question aims to explore the pedagogical approach or the rationale behind including a document with a title that suggests a focus on conceptual or abstract themes within a technical and regulatory-oriented training program.\n\n3. **Given the creation and last modified dates of the document are in the future (2024), what advancements or changes in the field of records and data integrity does the document anticipate or reflect?**\n - This question is interested in uncovering any forward-looking insights, trends, or predictions that the document might discuss, especially considering its creation in a future context where new developments in data integrity and regulatory compliance might have emerged.\n\nThese questions leverage the unique aspects of the provided context, such as the intriguing title of the document, its placement within a technical guide and educational program, and its future creation date, to seek information that would be uniquely addressed within this specific document.", "prev_section_summary": "The section discusses the challenges and considerations associated with maintaining data integrity when converting records to alternative media, particularly in the context of rapid data retrieval requirements such as during a product recall scenario. Key topics include the impact of retaining original electronic records after conversion, ensuring consistency between original and converted copies, implications of retaining an audit trail, and potential effects such as delays in search capabilities, resource-intensive searches, inconsistency of records, difficulty in executing and tracking changes, and the size and manageability of the archive. 
Entities mentioned include users, electronic records, alternative media, audit trail, GxP regulations, and the need for emergency procedures to eliminate administrative delays.", "excerpt_keywords": "ISPE, Records, Data Integrity, Blank Canvases, PharmaWise Engineer"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\nno_content_here", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "6bcdb43b-10e8-4484-8ecd-611a0c8d860e": {"__data__": {"id_": "6bcdb43b-10e8-4484-8ecd-611a0c8d860e", "embedding": null, "metadata": {"page_label": "133", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Best Practices for Managing Paper Records to Ensure Data Integrity", "questions_this_excerpt_can_answer": "1. What are the key principles for managing paper records to ensure data integrity according to the ISPE GAMP\u00ae guide, specifically within the context of regulatory requirements and quality management systems (QMS)?\n \n2. How does the ISPE GAMP\u00ae guide recommend handling the creation, review, approval, and management of documents and procedures to maintain data integrity in paper records and hybrid situations?\n\n3. 
What specific guidelines does the ISPE GAMP\u00ae guide provide for the indexing, identification, protection, and version control of documents to prevent unauthorized or inadvertent changes, in the context of ensuring data integrity for paper-based records?", "prev_section_summary": "The section discusses a document titled \"Exploring the Absence of Content: A Study on Blank Canvases\" within the ISPE Records and Data Integrity Guide. It raises questions about the significance of the document in relation to records and data integrity, its inclusion in the PharmaWise Engineer program, and the potential insights it may offer regarding future advancements in the field. The section emphasizes the unique aspects of the document, such as its title, placement, and creation date, to explore specific information that may be addressed within this context.", "excerpt_keywords": "ISPE GAMP, paper records, data integrity, regulatory requirements, quality management systems"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: page 131\n\n## records and data integrity appendix o2\n\n### appendix o2 - paper records and hybrid situations\n\n### 19.1 paper records\n\n#### 19.1.1 introduction\n\nthe management of paper based records should support data integrity. the management system for paper records should be designed to meet regulatory requirements and should be an integral part of the qms. paper records should be controlled and managed according to the principles of alcoa+. 
see section 1.5.4.\n\nin this appendix, the term \"document\" is used to reflect common usage for paper, and is aligned in approach and terminology with eudralex: rules governing medicinal products in the european union; volume 4 good manufacturing practice - chapter 4: documentation [6], where it states under \"principles\" that \"there are two primary types of documentation used to manage and record gmp compliance: instructions (directions, requirements) and records/reports.\"\n\nthe management of paper records is well established and well described in applicable regulations and regulatory guidance. this appendix provides a high-level overview of such expectations, but is intended to be neither prescriptive nor exhaustive. the relevant applicable regulations and guidance should be consulted when defining and designing a management system for paper records.\n\n#### 19.1.2 overview\n\nprocedures should be established for:\n\n- creation, review, and approval for use of documents and procedures (including instructions, records, and templates)\n- management of copies of documents for routine use, ensuring copies of documents and forms are issued and reconciled for use in a controlled and traceable manner\n- completion of paper based documents, including identification of individuals, data entry formats, and how amendments are recorded\n- routine review of completed documents for accuracy, authenticity, and completeness\n- filing, retrieval, retention, archive, and disposal of documents\n\ndetailed specific requirements depend on the nature of the document and any applicable specific regulatory requirements.\n\n#### 19.1.3 management\n\nan index of all documents (including template documents) should be maintained. all documents should be uniquely identifiable (including a version number) and should be checked, approved, signed, and dated, as appropriate [13]. documents should be protected from unauthorized or inadvertent changes. 
the reproduction of document copies should not allow any error to be introduced through the reproduction process [6]. different versions of templates should be maintained using change control.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "64c8962f-e9b1-48d9-87f5-e5c326033c72": {"__data__": {"id_": "64c8962f-e9b1-48d9-87f5-e5c326033c72", "embedding": null, "metadata": {"page_label": "134", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Ensuring Data Integrity and Document Control in GAMP Compliance: A Comprehensive Guide\"", "questions_this_excerpt_can_answer": "1. What specific guidelines does the ISPE GAMP\u00ae guide provide for the management and destruction of obsolete documents to ensure data integrity and compliance with applicable laws?\n \n2. How does the ISPE GAMP\u00ae guide recommend handling handwritten entries in documents to maintain their integrity, including the treatment of unused fields and the process for making corrections?\n\n3. What procedures does the ISPE GAMP\u00ae guide outline for the distribution, storage, and protection of controlled documents to prevent unauthorized access and ensure that only the current approved versions are used?", "prev_section_summary": "The section discusses the management of paper records to ensure data integrity according to the ISPE GAMP\u00ae guide. 
Key topics include the principles of managing paper records, creation, review, approval, and management of documents, indexing, identification, protection, and version control of documents. Entities mentioned include regulatory requirements, quality management systems, creation of documents, review and approval processes, completion of paper-based documents, filing, retrieval, retention, archive, and disposal of documents, and the importance of maintaining document integrity and version control.", "excerpt_keywords": "ISPE GAMP, data integrity, document control, obsolete documents, handwritten entries"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: records and data integrity\n\nappendix o2 updated versions should be distributed in a timely manner. procedures should ensure that only the current approved version is available for use [13]. procedures should cover the generation and management of controlled copies. documents should be stored in an appropriately secure location in a traceable and accessible manner for the required retention period. documents should be protected from damage, destruction, or unauthorized alteration [13]. obsolete documents should be archived for the required retention period and access to them restricted. any issued and unused circulated copies of superseded documents should be withdrawn and destroyed [13]. documents that have exceeded their retention period should also be destroyed, taking into account all applicable local and international laws and any ongoing litigation.\n\nuse\n\ndocument design should make clear what data is to be recorded [13], and provide sufficient space for manual data entries. 
the use of uncontrolled documents and the use of temporary recording practices should be prohibited [13], e.g., recording data on note paper prior to transferring it to the official record, such as a laboratory notebook or batch record. handwritten entries should be:\n\n- contemporaneous\n- made by the person who executed the task\n- clearly attributable to the individual\n- indelible\n- clear\n- legible\n\nunused, blank fields within documents should be marked as such (e.g., not applicable or n/a), dated and signed [13]. date formats should be consistent, clear, and unambiguous. corrections should be made in such a way that they:\n\n- do not obscure the original value (e.g., struck through with a single line)\n- are indelible\n- are dated\n- identify the person making the correction (e.g., by initials)\n\nwhere appropriate, the reason for the alteration should be recorded. a process for review and/or verification of records as required by the applicable regulations should be established. the recording of such reviews and/or verifications should follow good documentation practices.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "c0137e30-77da-4aed-af8b-5b426fffc987": {"__data__": {"id_": "c0137e30-77da-4aed-af8b-5b426fffc987", "embedding": null, "metadata": {"page_label": "135", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Strategies for Managing Hybrid Situations in Records and Data Integrity\"", "questions_this_excerpt_can_answer": "1. 
What specific procedural controls are recommended for ensuring the long-term integrity of data and maintaining links between hybrid components in a regulated environment, according to the ISPE Records and Data Integrity Guide?\n\n2. How does the ISPE Records and Data Integrity Guide suggest regulated companies should manage the primary record in hybrid situations, including the establishment and verification of controls?\n\n3. What are the detailed steps or procedures outlined in the ISPE Records and Data Integrity Guide for managing data retention and ensuring data integrity in hybrid situations, including requirements for retention periods, legibility, and retention of associated metadata?", "prev_section_summary": "The section discusses guidelines provided by the ISPE GAMP\u00ae guide for ensuring data integrity and document control in GAMP compliance. Key topics include the management and destruction of obsolete documents, handling handwritten entries, distribution and storage of controlled documents, and procedures for maintaining document integrity. 
Entities mentioned include procedures for document distribution, storage, protection, and destruction, as well as guidelines for making corrections to handwritten entries and ensuring the clarity and integrity of recorded data.", "excerpt_keywords": "ISPE, GAMP, data integrity, hybrid situations, procedural controls"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: page 133\n\n## records and data integrity appendix o2\n\n### 19.2 hybrid situations\n\n19.2.1 introduction\n\npaper and electronic record and signature components can coexist (i.e., a hybrid situation) as long as wider gxp requirements are met and the content and meaning of those records are preserved [16].\n\na formal risk assessment should be performed to ensure that suitable controls are in place and that all required data is retained. this guide encourages a move away from hybrid situations wherever practical.\n\nwhere other options are available, hybrid situations should be avoided, as the long-term integrity of data and the link between hybrid components, e.g., paper signatures and electronic records, can be difficult to ensure.\n\nprocedural controls may be needed to ensure the long-term integrity of data and to maintain links between hybrid components.\n\nthe presence of interfaces between paper based processes and electronic records and data processes increases the risk to data integrity. regulated companies should plan to replace systems requiring hybrid situations.\n\n19.2.2 controls for managing hybrid situations\n\nregulated companies should define and document which hybrid component is the primary record.\n\nsuitable controls should be established and verified. these may include sops that define the process of controlling the signed paper record, and for making modifications to the paper and electronic records, if needed. 
procedures should prevent the use of incorrect or out of date versions of records.\n\nexamples include:\n\n- procedure for signing paper copies: procedures should define the creation, review, and approval of the paper record, including attached printouts of electronic records. procedures should describe the link between the electronic record and the signed paper copy. the signed paper copy should be defined as the primary record.\n- procedure for data retention/data integrity: describing how data is managed and retained including:\n- retention period requirements\n- legibility for the required period\n- retention of date and time stamps\n- retention of data about the user who created the record (user id)\n- retention of associated metadata needed to preserve gxp content and meaning\n- changes recorded using change control and audit trail\n- access control: this may be physical access control to the document control center for paper records.\n- change control: changes managed on paper with change sop.\n- audit trail: no electronic audit trail is available; therefore, a paper audit trail could be maintained in the record, or maintained separately and linked by reference to the record.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "425a66ef-db1d-4201-9ff8-96df1ed57937": {"__data__": {"id_": "425a66ef-db1d-4201-9ff8-96df1ed57937", "embedding": null, "metadata": {"page_label": "136", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Ensuring Data Integrity in the Transfer and Management of 
Chromatography Data and Spreadsheets: Best Practices and Guidelines", "questions_this_excerpt_can_answer": "1. What are the recommended practices for transferring data from old systems to ensure long-term storage and integrity according to the ISPE GAMP\u00ae Guide's Appendix O2?\n \n2. How does the ISPE GAMP\u00ae Guide suggest handling the retention and control of chromatography data and spreadsheets to maintain data integrity, especially in hybrid situations where both electronic and paper records are used?\n\n3. What specific guidelines does the ISPE GAMP\u00ae Guide provide for managing changes in parameters and conducting risk assessments related to the reprocessing of samples and the transfer of data to different formats or media for long-term storage?", "prev_section_summary": "The section discusses the management of hybrid situations in records and data integrity according to the ISPE Records and Data Integrity Guide. Key topics include the introduction to hybrid situations, the need for a formal risk assessment, the importance of suitable controls, and the recommendation to move away from hybrid situations whenever possible. It also outlines specific procedural controls for managing hybrid components, such as defining the primary record, establishing controls for signing paper copies, managing data retention and integrity, and implementing access control, change control, and audit trail procedures. 
The section emphasizes the challenges of ensuring the long-term integrity of data and maintaining links between paper and electronic components in regulated environments.", "excerpt_keywords": "ISPE, GAMP, Data Integrity, Chromatography, Spreadsheets"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: appendix o2 records and data integrity\n\ntransfer of data from old systems: old systems may have retained data overwritten every time the system is used or after a period of time. data for long term storage of the record should be stored by printing out and signing. formal risk assessments should be used to determine what forms the complete record for transfer; this may include metadata.\n\npaper records which include attached electronic data may be retained as paper or by scanning into a separate electronic system for long term storage. these records require the same controls as described in this section, plus additional controls to be applied to the electronic storage system.\n\nif a record in electronic format is used to support regulated activities or decisions, then a printed copy cannot be considered as the primary record. if only the paper copy is used, it may be possible to consider it as the primary record. in such cases, it may be possible to delete the electronic record.\n\nnote: this reasoning does not apply to all records. 
while it may be acceptable for a validation report, it may not be appropriate for records where metadata is crucial to the integrity and meaning of the record, e.g., for chromatography, or for spreadsheets that manipulate data such as performing calculations.\n\n### practical difficulties with hybrid situations\n\nexamples provided in this section of the appendix illustrate some practical difficulties with establishing and maintaining hybrid situations.\n\nchromatography data\n\nusual practice has been to print out the chromatogram and approve it by applying signatures to the paper record and attaching the printout to the batch record. this does not capture all the required raw data to enable the sample to be rerun. additional information related to the analytical method including setup, solvent gradient, base line noise suppression information, etc., should be retained.\n\nthe quantity and complexity of the raw data means that the data needs to be retained in the original computer system to ensure that samples can be reprocessed and compared. it cannot be managed on paper.\n\nthe management of chromatography data should be defined in a formal procedure that should include the ability to reprocess samples and compare all raw data. the scope of reprocessing includes reintegration of chromatograms.\n\nresetting parameters and describing changes are part of normal operation and should be recorded and controlled by change control and audit trail (manual intervention/integration), including the retention of all raw data.\n\nrisk assessment should look at the risks related to reprocessing samples and comparing the raw data. any change of format and media should also be subject to risk assessment, as the data may be transferred to another system for long term storage.\n\nspreadsheets\n\nrecording production and laboratory data and calculating results is frequently performed using a spreadsheet. 
the spreadsheet can be used to record the data and can be configured to manipulate the data to obtain a result. these results can be printed out, reviewed, signed, and dated. the printout is retained in the paper batch record or laboratory record; therefore, the paper printout should be regarded as the primary record.\n\nthe spreadsheet should be regarded as the original data and adequately protected, unless the data has been stored separately and securely.\n\nspreadsheet records and calculations using templates should be verified and controlled as described in ispe gamp(r) 5.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "70f13571-b355-456c-a567-9138bce04645": {"__data__": {"id_": "70f13571-b355-456c-a567-9138bce04645", "embedding": null, "metadata": {"page_label": "137", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Ensuring Data Integrity and Compliance in Record Management and Equipment Use: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. What specific considerations should be made when using spreadsheets for managing and retaining records in the context of data integrity and compliance, according to the ISPE GAMP\u00ae Guide?\n \n2. How does the ISPE Records and Data Integrity Guide suggest handling electronic data collected by production equipment that cannot directly record confirmation of key processing steps or manage primary or fixed data changes?\n\n3. 
What are the recommended procedures for managing risks associated with older production equipment that may offer limited access control features, as outlined in the ISPE Records and Data Integrity Guide?", "prev_section_summary": "The section discusses the ISPE GAMP\u00ae Guide's recommendations for ensuring data integrity in the transfer and management of chromatography data and spreadsheets. Key topics include transferring data from old systems, handling hybrid situations with electronic and paper records, managing chromatography data, and using spreadsheets for recording and calculating results. Entities mentioned include metadata, risk assessments, raw data retention, reprocessing samples, change control, audit trails, and verification of spreadsheet records.", "excerpt_keywords": "ISPE, GAMP, data integrity, compliance, record management"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: page 135\n\n### records and data integrity appendix o2\n\nthe use of spreadsheets should be described in a formal procedure and the risks related to managing and retaining these records should be formally assessed.\n\nproduction equipment\n\nproduction equipment may collect electronic data which is printed out for subsequent review and approval as part of a paper batch record.\n\nit may not be possible to record confirmation of key processing steps or record the management of primary data or fixed data. changes may need to be recorded separately on the batch record system, e.g., changes made to:\n\n- machine settings\n- set points\n- processing instructions\n- warning/action alarms\n\na formal risk assessment should consider risks related to managing the retained data, including any changes to primary data.\n\nthe data may be transferred to another system for long term storage. 
any change of format and media should also be subject to risk assessment.\n\nsystems may retain information in a fixed size buffer, and information may be overwritten, once it becomes older information, or once the buffer is full. critical data in such systems should be transferred to another system or to paper records to prevent its loss.\n\nolder equipment may provide only physical access control or limited logical access, e.g., only one id and password for all users. a formal risk assessment should consider risks related to unauthorized access/changes. additional physical controls, procedures, and training may be required.\n\n### use of forms to enforce procedures\n\nin primarily paper based or hybrid situations, the use of forms should be considered to capture data, and to ensure that all data required for each step of the process is recorded. forms should include references to the data, standards, or sops that they support, to enable linkage to associated electronic records, and to assist with archiving.\n\nforms can be retained on paper or scanned into an electronic system for long term storage. 
any change of format and media should be subject to formal risk assessment.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "715ebe03-a2e2-4e58-9286-0d4ec4829de4": {"__data__": {"id_": "715ebe03-a2e2-4e58-9286-0d4ec4829de4", "embedding": null, "metadata": {"page_label": "138", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Blank Canvas: Exploring the Absence of Content in Art and Design\"", "questions_this_excerpt_can_answer": "Based on the provided context, here are three questions that this specific context can provide specific answers to, which are unlikely to be found elsewhere:\n\n1. **What is the file size of the ISPE Records and Data Integrity Guide as stored in the PharmaWise Engineer project on Google Drive?**\n - This question is specific to the document's digital footprint within a particular storage solution, information that is uniquely detailed in the provided context.\n\n2. **What are the creation and last modification dates of the document titled \"Blank Canvas: Exploring the Absence of Content in Art and Design\" found within the ISPE Records and Data Integrity Guide?**\n - The context uniquely provides the creation and last modification dates of a document, which seems to have a misleading title or content discrepancy based on the expected content of the guide versus the title provided.\n\n3. 
**Why might a document within a pharmaceutical development and data integrity guide be titled \"Blank Canvas: Exploring the Absence of Content in Art and Design\"?**\n - This question probes into the reasoning or significance behind the naming of a document, which appears to be out of context within a guide focused on records and data integrity in the pharmaceutical industry. The answer to this question would likely delve into the document's content or purpose, which is uniquely positioned within the provided context.\n\nThese questions leverage the specific details provided in the context, such as file metadata and the apparent discrepancy between the document title and the expected content of the guide, to formulate inquiries that are uniquely answerable by the given information.", "prev_section_summary": "The section discusses the use of spreadsheets for managing and retaining records in the context of data integrity and compliance, as outlined in the ISPE GAMP\u00ae Guide. It also addresses the handling of electronic data collected by production equipment, the risks associated with older production equipment, and the use of forms to enforce procedures in paper-based or hybrid environments. Key topics include formal risk assessments, managing retained data, transferring data to other systems, access control considerations, and the use of forms for data capture and archiving. 
Key entities mentioned include production equipment, spreadsheets, electronic data, batch records, machine settings, set points, processing instructions, warning/action alarms, forms, standards, and SOPs.", "excerpt_keywords": "ISPE, Records, Data Integrity, Guide, PharmaWise Engineer"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\nno_content_here", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "5d09bb37-86d1-4fe5-9e67-f4e38b31be7a": {"__data__": {"id_": "5d09bb37-86d1-4fe5-9e67-f4e38b31be7a", "embedding": null, "metadata": {"page_label": "139", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Data Integrity Guidance and Regulations in the Pharmaceutical Industry: A Comprehensive Overview", "questions_this_excerpt_can_answer": "1. What are the key references for understanding the regulatory and guidance framework surrounding data integrity and electronic records in the pharmaceutical industry as of 2024?\n\n2. How has the International Society for Pharmaceutical Engineering (ISPE) contributed to the guidance on compliant GxP computerized systems and data integrity, according to the most recent publications up to 2024?\n\n3. 
What are the specific guidelines and draft versions provided by regulatory bodies such as the MHRA, FDA, and PIC/S for ensuring data integrity and compliance with Good Manufacturing Practices (GMP) in the pharmaceutical sector as detailed in the document?", "prev_section_summary": "The section provides metadata details about a document titled \"Blank Canvas: Exploring the Absence of Content in Art and Design\" found within the ISPE Records and Data Integrity Guide. It highlights the file size, creation and last modification dates, and poses questions regarding the document's title and content within the pharmaceutical development and data integrity context. The section prompts inquiries into the document's purpose and the reasoning behind its title, suggesting a potential discrepancy between the expected content of the guide and the document's title.", "excerpt_keywords": "ISPE, GAMP, data integrity, electronic records, regulatory guidance"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: page 137\n\n### records and data integrity appendix g1\n\n|references| |\n|---|---|\n|1. mhra gmp data integrity definitions and guidance for industry, revision 1.1, march 2015, www.gov.uk/government/publications/good-manufacturing-practice-data-integrity-definitions.| |\n|2. 21 cfr part 11 - electronic records; electronic signatures, code of federal regulations, us food and drug administration (fda), www.fda.gov.| |\n|3. ispe gamp(r) 5: a risk-based approach to compliant gxp computerized systems, international society for pharmaceutical engineering (ispe), fifth edition, february 2008, www.ispe.org.| |\n|4. ispe gamp(r) guidance documents, international society for pharmaceutical engineering (ispe), http://www.ispe.org/publications-guidance-documents/series.| |\n|5. 
us code of federal regulations (cfrs), https://www.gpo.gov/fdsys/browse/collectioncfr.action?collectioncode=cfr.| |\n|6. eudralex volume 4 - guidelines for good manufacturing practice for medicinal products for human and veterinary use, chapter 4: documentation, january 2011, http://ec.europa.eu/health/documents/eudralex/vol-4/index_en.htm.| |\n|7. eudralex volume 4 - guidelines for good manufacturing practices for medicinal products for human and veterinary use, annex 11: computerized systems, june 2011, http://ec.europa.eu/health/documents/eudralex/vol-4/index_en.htm.| |\n|8. mhra gxp data integrity definitions and guidance for industry, draft version for consultation, july 2016, https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/538871/mhra_gxp_data_integrity_consultation.pdf.| |\n|9. fda draft guidance for industry: data integrity and compliance with cgmp, april 2016, us food and drug administration (fda), www.fda.gov.| |\n|10. international council for harmonisation (ich), ich harmonised tripartite guideline, quality risk management - q9, step 4, 9 november 2005, www.ich.org.| |\n|11. international council for harmonisation (ich), ich harmonised tripartite guideline, pharmaceutical quality system - q10, step 4, 4 june 2008, www.ich.org.| |\n|12. who technical report series, no. 996, annex 5: guidance on good data and record management practices, world health organization (who), 2016, http://apps.who.int/medicinedocs/en/d/js22402en/.| |\n|13. pic/s draft guidance: pi 041-1 (draft 2) good practices for data management and integrity in regulated gmp/gdp environments, august 2016, pharmaceutical inspection co-operation scheme (pic/s), https://www.picscheme.org/.| |\n|14. ispeak blog post, \"data integrity, critical thinking & mhra 2017, oh my!\" 21 october 2016, http://blog.ispe.org/critical-thinking-data-integrity-mhra.| |\n|15. 
mhra out of specification investigations guidance, august 2013, https://www.gov.uk/government/publications/out-of-specification-investigations.| |\n|16. fda guidance for industry: part 11, electronic records; electronic signatures - scope and application, august 2003, us food and drug administration (fda), www.fda.gov.| |", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "9be4cb6d-10b8-42c3-9c61-740d772e26d4": {"__data__": {"id_": "9be4cb6d-10b8-42c3-9c61-740d772e26d4", "embedding": null, "metadata": {"page_label": "140", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Regulatory Compliance, Risk Management, and Data Integrity in Pharmaceutical Engineering", "questions_this_excerpt_can_answer": "1. What are the key resources or guides recommended by the ISPE for implementing a risk-based approach to the testing and operation of GxP computerized systems, and what are their publication dates?\n \n2. Can you identify specific regulatory and guidance documents that address the management of data integrity and compliance within the pharmaceutical engineering field, including their issuing organizations and publication or revision dates?\n\n3. 
What are some of the specialized topics covered in the document that relate to improving pharmaceutical engineering practices, such as cloud computing in a GxP environment or the design and validation of spreadsheets for laboratory use, including the sources and dates of these discussions?", "prev_section_summary": "The section provides a list of key references for understanding the regulatory and guidance framework surrounding data integrity and electronic records in the pharmaceutical industry as of 2024. It includes references from regulatory bodies such as the MHRA, FDA, and PIC/S, as well as guidelines from organizations like ISPE and ICH. The section highlights the importance of compliant GxP computerized systems and data integrity, and provides links to relevant guidance documents and draft versions for ensuring compliance with Good Manufacturing Practices (GMP) in the pharmaceutical sector. Key topics covered include data integrity definitions, electronic records and signatures, risk-based approaches to compliant systems, and guidelines for good manufacturing practices.", "excerpt_keywords": "ISPE, GAMP, data integrity, pharmaceutical engineering, regulatory compliance"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\nispe gamp(r) good practice guide: a risk-based approach to testing of gxp systems, international society for pharmaceutical engineering (ispe), second edition, december 2012, www.ispe.org.\nispe gamp(r) good practice guide: a risk-based approach to operation of gxp computerized systems, international society for pharmaceutical engineering (ispe), first edition, january 2010, www.ispe.org.\nispe gamp(r) concept paper: considerations for a corporate data integrity program, international society for pharmaceutical engineering (ispe), march 2016, www.ispe.org.\nmeyer, erin, the culture map, publisher: publicaffairs, 2014, 
http://www.publicaffairsbooks.com/book/hardcover/the-culture-map/9781610392501.\nwingate, guy, \"data integrity: management factors and effective leadership,\" ispe uk annual meeting presentation, 10 november 2016.\nmcauley, gerry, \"optimizing human performance,\" biopharm international, july 2014, volume 27, issue 7.\ncressey, donald r., other people's money: a study in the social psychology of embezzlement, publisher: patterson smith, 1953.\n21 cfr part 211 - current good manufacturing practice for finished pharmaceuticals, code of federal regulations, us food and drug administration (fda), www.fda.gov.\nus fda compliance program guidance manual 7346.832: pre-approval inspections, 2010, us food and drug administration (fda), www.fda.gov.\nassociation of record managers and administrators (arma), www.arma.org.\namerican national standards institute (ansi), www.ansi.org.\nstokes, david, \"compliant cloud computing - managing the risks,\" pharmaceutical engineering, july/august 2013, www.ispe.org.\nispe gamp(r) cloud computing special interest group (sig), \"cloud computing in a gxp environment: the promise, the reality and the path to clarity,\" pharmaceutical engineering, january/february 2014, www.ispe.org.\nus fda field science and laboratories: laboratory manual, volume iii - laboratory operations, applications and programs, section 4.5 - development and validation of spreadsheets for calculation of data, https://www.fda.gov/scienceresearch/fieldscience/laboratorymanual.\ndfs/ora laboratory information bulletin no. 
4317, spreadsheet design and validation for the multi-user application for the chemistry laboratory - part i, 2004, division of field science (dfs)/office of regulatory affairs (ora), us food and drug administration (fda), www.fda.gov.\n21 cfr part 58.190 - good laboratory practice (glp) for nonclinical laboratory studies; storage and retrieval of records and data, code of federal regulations, us food and drug administration (fda), www.fda.gov.\neudralex volume 4 - guidelines for good manufacturing practices for medicinal products for human and veterinary use, chapter 6: quality control, october 2014, http://ec.europa.eu/health/documents/eudralex/vol-4/index_en.htm.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "a0d101a4-286c-4579-aac2-40278be78b56": {"__data__": {"id_": "a0d101a4-286c-4579-aac2-40278be78b56", "embedding": null, "metadata": {"page_label": "141", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Regulatory Guidelines and Standards in the Pharmaceutical Industry: A Comprehensive Overview", "questions_this_excerpt_can_answer": "1. What specific ISO guide is referenced in the ISPE Records and Data Integrity Guide for incorporating safety aspects into standards, and where can it be found?\n \n2. As of the document's last update, which NIST publication provides the definition of cloud computing and its publication date?\n\n3. 
What FDA guidance document is mentioned in the ISPE Records and Data Integrity Guide that pertains to electronic source data in clinical investigations, including its publication month and year?", "prev_section_summary": "The section discusses key resources and guides recommended by the ISPE for implementing a risk-based approach to testing and operating GxP computerized systems. It also covers regulatory and guidance documents related to data integrity and compliance in pharmaceutical engineering, including issuing organizations and publication dates. Specialized topics such as cloud computing in a GxP environment and the design and validation of spreadsheets for laboratory use are also addressed. Entities mentioned include the ISPE, FDA, ARMA, ANSI, and various authors and publications related to pharmaceutical engineering practices.", "excerpt_keywords": "ISPE, Records, Data Integrity, ISO, FDA"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n|ispe gamp(r) guide:|page 139|\n|---|---|\n|records and data integrity|appendix g1|\n|34.|iso/iec guide 51:2014 safety aspects -- guidelines for their inclusion in standards, international organization for standardization (iso), www.iso.org, and international electrotechnical commission (iec), www.iec.ch.|\n|35.|nist sp 800-145, the nist definition of cloud computing, september 2011, national institute of standards and technology (nist), www.nist.gov.|\n|36.|ispe glossary of pharmaceutical and biotechnology terminology, www.ispe.org.|\n|37.|21 cfr part 58 - good laboratory practice (glp) for nonclinical laboratory studies, code of federal regulations, us food and drug administration (fda), www.fda.gov.|\n|38.|iso guide 73:2002 risk management - vocabulary - guidelines for use in standards, international standards organization (iso), www.iso.org.|\n|39.|fda guidance for industry: electronic source data in clinical 
investigations, september 2013, us food and drug administration (fda), www.fda.gov.|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "c0e91153-90ae-4095-bdba-f028e388dc42": {"__data__": {"id_": "c0e91153-90ae-4095-bdba-f028e388dc42", "embedding": null, "metadata": {"page_label": "142", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Exploring the Concept of Empty Space", "questions_this_excerpt_can_answer": "Based on the provided context, here are three questions that this specific context can provide specific answers to, which are unlikely to be found elsewhere:\n\n1. **What are the specific guidelines or recommendations provided by the ISPE regarding records and data integrity in the pharmaceutical industry?**\n - Given the document title \"ISPE Records and Data Integrity Guide,\" this context is uniquely positioned to offer detailed insights, standards, and practices recommended by the International Society for Pharmaceutical Engineering (ISPE) for maintaining records and data integrity within the pharmaceutical sector. This can include methodologies for ensuring the accuracy, reliability, and consistency of data throughout its lifecycle.\n\n2. 
**How does the document address the challenges and solutions related to data integrity in the context of pharmaceutical development and manufacturing?**\n - The context suggests that the document is a comprehensive guide, likely detailing common challenges faced by pharmaceutical companies in ensuring data integrity and providing practical solutions or best practices to overcome these challenges. This could encompass aspects such as electronic records management, audit trails, data storage and retrieval, and compliance with regulatory requirements.\n\n3. **What are the implications of the document's creation and last modification dates for the pharmaceutical industry's practices in records and data management?**\n - With the creation date being April 7, 2024, and the last modification date being April 4, 2024, this document represents a very recent perspective on data integrity issues within the pharmaceutical industry. Questions can be raised about how recent developments, technological advancements, or regulatory changes have influenced the guidelines provided in this document. Additionally, it may offer insights into emerging trends or future directions in pharmaceutical data management practices.\n\nThese questions leverage the unique context of the document to explore its content and relevance to the pharmaceutical industry's ongoing efforts to maintain high standards of data integrity and compliance.", "prev_section_summary": "The section discusses various regulatory guidelines and standards in the pharmaceutical industry, including references to specific ISO guides, NIST publications, and FDA guidance documents. Key topics include safety aspects in standards, the definition of cloud computing, good laboratory practices, risk management vocabulary, and electronic source data in clinical investigations. 
Entities mentioned include the International Organization for Standardization (ISO), the International Electronical Commission (IEC), the National Institute of Standards and Technology (NIST), the US Food and Drug Administration (FDA), and the ISPE (International Society for Pharmaceutical Engineering).", "excerpt_keywords": "ISPE, Records, Data Integrity, Pharmaceutical Industry, Guidelines"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\nno_content_here", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "6d2428e2-11f1-41c5-9a4b-871028fd5df7": {"__data__": {"id_": "6d2428e2-11f1-41c5-9a4b-871028fd5df7", "embedding": null, "metadata": {"page_label": "143", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Pharmaceutical Quality Management Glossary: A Comprehensive Guide to Terminology and Definitions in the Pharmaceutical Industry", "questions_this_excerpt_can_answer": "1. What does the acronym \"ALCOA+\" stand for in the context of data integrity within the pharmaceutical industry, and how does it expand upon the original \"ALCOA\" principles?\n \n2. In the realm of pharmaceutical quality management, what specific standards or organizations does the acronym \"ANSI\" refer to, and how might it be relevant to the industry's regulatory compliance?\n\n3. 
Can you detail the significance of the \"CAPA\" system in pharmaceutical manufacturing practices and how it relates to maintaining or improving quality standards according to the ISPE Records and Data Integrity Guide?", "prev_section_summary": "The section discusses the ISPE Records and Data Integrity Guide, focusing on guidelines and recommendations for maintaining data integrity in the pharmaceutical industry. It addresses challenges and solutions related to data integrity in pharmaceutical development and manufacturing, including electronic records management and compliance with regulatory requirements. The creation and modification dates of the document suggest a recent perspective on data integrity issues, potentially reflecting emerging trends and future directions in pharmaceutical data management practices.", "excerpt_keywords": "ISPE, Records, Data Integrity, Pharmaceutical Quality Management, Glossary"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## appendix g2 - glossary\n\n|acronyms and abbreviations|\n|---|\n|adr|adverse drug reaction|\n|ae|adverse event|\n|alcoa|attributable, legible, contemporaneous, original, accurate|\n|alcoa+|alcoa, with the addition of complete, consistent, enduring, available|\n|ansi|american national standards institute (us)|\n|api|active pharmaceutical ingredient|\n|aql|acceptable quality limit or acceptance quality level|\n|arma|association of record managers and administrators (us)|\n|capa|corrective and preventive action|\n|cdo|chief data officer|\n|ceo|chief executive officer|\n|cds|chromatography data system|\n|cgmp|current good manufacturing practice|\n|cfr|code of federal regulations|\n|cmmi|capability maturity model integration|\n|dba|data base administrator|\n|ds|design specification|\n|edms|electronic document management system|\n|erp|enterprise resource planning|\n|eu|european union|\n|faq|frequently asked 
questions|\n|fda|food & drug administration (us)|\n|fs|functional specification|\n|gamp(r)|good automated manufacturing practice|\n|gc|gas chromatography|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "df6e22a6-4d98-4305-93f0-19d675691022": {"__data__": {"id_": "df6e22a6-4d98-4305-93f0-19d675691022", "embedding": null, "metadata": {"page_label": "144", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Comprehensive Guide to Good Practices in the Pharmaceutical Industry", "questions_this_excerpt_can_answer": "1. What does the acronym \"gdocp\" stand for in the context of good practices within the pharmaceutical industry, as outlined in the ISPE Records and Data Integrity Guide?\n\n2. How does the ISPE Records and Data Integrity Guide define the relationship between \"gmp\" and \"gxp\" in the context of ensuring quality and compliance in pharmaceutical operations?\n\n3. In the ISPE Records and Data Integrity Guide, which regulatory agency is identified by the acronym \"mhra,\" and what is its geographical jurisdiction?", "prev_section_summary": "The section provides a glossary of acronyms and abbreviations commonly used in the pharmaceutical industry, including definitions for terms like ALCOA, ALCOA+, ANSI, API, AQL, ARMA, CAPA, CEO, CGMP, CFR, CMMI, DBA, DS, EDMS, ERP, EU, FAQ, FDA, FS, GAMP, GC, among others. 
It also addresses the significance of these terms in relation to data integrity, quality management, regulatory compliance, and manufacturing practices within the pharmaceutical industry.", "excerpt_keywords": "ISPE, Records, Data Integrity, Pharmaceutical Industry, Good Practices"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: appendix g2\n\n|gcp|good clinical practice|\n|---|---|\n|gdp|good distribution practice|\n|gdocp|good documentation practice|\n|gep|good engineering practice|\n|glp|good laboratory practice|\n|gmp|good manufacturing practice|\n|gpvp|good pharmacovigilance practice|\n|gvp|good pharmacovigilance practices|\n|gxp|good \"x\" practice|\n|hr|human resources|\n|iaas|infrastructure as a service|\n|ich|international council for harmonisation|\n|it|information technology|\n|jpeg|joint photographic experts group|\n|kpi|key performance indicator|\n|lc|liquid chromatography|\n|lims|laboratory information management system|\n|mes|manufacturing execution system|\n|mhra|medicines and healthcare products regulatory agency (united kingdom)|\n|os|operating system|\n|oos|out of specification|\n|paas|platform as a service|\n|pc|personal computer|\n|pdf|portable document format|\n|pic/s|pharmaceutical inspection convention and pharmaceutical inspection cooperation scheme|\n|qa|quality assurance|\n|qc|quality control|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "a1636b6d-58a1-4f6f-bb1c-60f90b9d771a": {"__data__": {"id_": "a1636b6d-58a1-4f6f-bb1c-60f90b9d771a", "embedding": null, "metadata": {"page_label": "145", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data 
Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "ISPE GAMP(r) Guide: Definitions and Acronyms", "questions_this_excerpt_can_answer": "1. What specific definitions or examples does the ISPE GAMP(r) Guide provide for identifying atypical, aberrant, or anomalous results in the context of MHRA out of specification investigations guidance?\n\n2. How does the ISPE GAMP(r) Guide define biometrics in accordance with US FDA, 21 CFR Part 11, and what are the key characteristics that make biometrics a reliable method for verifying an individual's identity?\n\n3. Can the ISPE GAMP(r) Guide elaborate on the concept of critical thinking within the pharmaceutical industry's context, particularly in evaluating information for quality management and risk assessment?", "prev_section_summary": "The section provides a list of acronyms and their corresponding definitions related to good practices within the pharmaceutical industry as outlined in the ISPE Records and Data Integrity Guide. Key topics include good manufacturing practice (GMP), good documentation practice (GDocP), good clinical practice (GCP), and others. The section also mentions regulatory agencies such as the Medicines and Healthcare Products Regulatory Agency (MHRA) and organizations like the International Council for Harmonisation (ICH). 
Overall, the section covers essential terms and concepts essential for ensuring quality and compliance in pharmaceutical operations.", "excerpt_keywords": "ISPE, GAMP, data integrity, biometrics, critical thinking"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: page 143\n\n## records and data integrity appendix g2\n\n|qms|quality management system|\n|---|---|\n|qrm|quality risk management|\n|raid|redundant array of independent disks|\n|rpo|recovery point objective|\n|rto|recovery time objective|\n|saas|software as a service|\n|sdlc|software development life cycle|\n|sds|software design specification|\n|sgml|standard generalized markup language|\n|sla|service level agreement|\n|sme|subject matter expert|\n|sop|standard operating procedure|\n|uat|user acceptance test|\n|ups|uninterruptible power supply|\n|urs|user requirements specification|\n|who|world health organization|\n|xml|extensible markup language|\n\n## 21.2 definitions\n\natypical / aberrant / anomalous result (mhra out of specification investigations guidance [15])\n\nresults that are still within specification but are unexpected, questionable, irregular, deviant or abnormal. 
examples would be chromatograms that show unexpected peaks, unexpected results for stability test point, etc.\n\nbiometrics (us fda, 21 cfr part 11 [2])\n\na method of verifying an individual's identity based on measurement of the individual's physical feature(s) or repeatable action(s) where those features and/or actions are both unique to that individual and measurable.\n\ncritical thinking\n\na systematic, rational, and disciplined process of evaluating information from a variety of perspectives to yield a balanced and well-reasoned answer.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "e9f4e436-7d08-4e52-9d61-79d9fab06385": {"__data__": {"id_": "e9f4e436-7d08-4e52-9d61-79d9fab06385", "embedding": null, "metadata": {"page_label": "146", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Ensuring Data Integrity and Governance in Regulated Computerized Systems: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. What definition does the MHRA 2015 guidance provide for \"data governance\" in the context of regulated computerized systems, and how does it emphasize the importance of data throughout its life cycle?\n\n2. How does the ISPE GAMP(R) Guide: Appendix G2 differentiate between the roles of a \"data owner\" and a \"data steward\" in ensuring data integrity and compliance within a regulated environment?\n\n3. 
According to the US FDA's 21 CFR Part 11, how is a \"digital signature\" distinguished from an \"electronic signature,\" and what implications does this distinction have for the authentication and integrity of data in regulated computerized systems?", "prev_section_summary": "The section provides definitions and acronyms related to quality management systems, quality risk management, and software development life cycle. It also defines terms such as atypical, aberrant, and anomalous results in the context of MHRA out of specification investigations guidance, biometrics according to US FDA regulations, and critical thinking in the pharmaceutical industry. The section emphasizes the importance of evaluating information for quality management and risk assessment in a systematic and rational manner.", "excerpt_keywords": "data governance, data integrity, data owner, data steward, electronic signature"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: appendix g2 records and data integrity\n\ndata governance (mhra, 2015 [1])\n\nthe sum total of arrangements to ensure that data, irrespective of the format in which it is generated, is recorded, processed, retained and used to ensure a complete, consistent and accurate record throughout the data life cycle.\n\ndata integrity (mhra, 2015 [1])\n\nthe extent to which all data are complete, consistent and accurate throughout the data life cycle.\n\ndata owner\n\nthe person ultimately responsible for the integrity and compliance of specific data at various stages of the data life cycle in accordance with applicable policies and sops. 
the data owner may also be the process owner.\n\ndata steward\n\na person with specific tactical coordination and implementation responsibilities for data integrity, responsible for carrying out data usage, management, and security policies as determined by wider data governance initiatives, such as acting as a liaison between the it department and the business. they are typically members of the operational unit or department creating, maintaining, or using the data, for example, personnel on the shop floor or in the laboratories who actually generate, manage, and handle the data.\n\ndetectability (ich q9 [10])\n\nthe ability to discover or determine the existence, presence, or fact of a hazard.\n\ndigital signature (us fda, 21 cfr part 11 [2])\n\nan electronic signature based upon cryptographic methods of originator authentication, computed by using a set of rules and a set of parameters such that the identity of the signer and the integrity of the data can be verified.\n\nelectronic record (us fda, 21 cfr part 11 [2])\n\nany combination of text, graphics, data, audio, pictorial, or other information representation in digital form that is created, modified, maintained, archived, retrieved, or distributed by a computer system.\n\nelectronic signature (us fda, 21 cfr part 11 [2])\n\na computer data compilation of any symbol or series of symbols executed, adopted, or authorized by an individual to be the legally binding equivalent of the individual's handwritten signature.\n\ngxp regulated computerized system (ispe gamp(r) 5 [3])\n\ncomputerized systems that are subject to gxp regulations. 
the regulated company must ensure that such systems comply with the appropriate regulations.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "1068dcdd-a59b-40cb-b46f-a3b4c70dc523": {"__data__": {"id_": "1068dcdd-a59b-40cb-b46f-a3b4c70dc523", "embedding": null, "metadata": {"page_label": "147", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Ensuring Regulatory Compliance and Data Integrity in the Pharmaceutical Industry", "questions_this_excerpt_can_answer": "1. What are the specific types of Good Practice (GxP) regulations outlined in the ISPE Records and Data Integrity Guide that are applicable to the pharmaceutical industry, and how do they contribute to ensuring regulatory compliance and data integrity?\n\n2. How does the ISPE Records and Data Integrity Guide define a \"hybrid situation\" in the context of records and data management, and what implications does this have for pharmaceutical companies managing both paper and electronic records?\n\n3. According to the ISPE Records and Data Integrity Guide, how is \"metadata\" defined in the context of pharmaceutical data management, and what role does it play in ensuring the integrity and traceability of data within the industry?", "prev_section_summary": "This section discusses key topics related to data integrity and governance in regulated computerized systems. 
It covers definitions of data governance and data integrity, the roles of data owner and data steward in ensuring compliance, detectability of hazards, digital and electronic signatures, and the definition of electronic records. The section also mentions the importance of GxP regulated computerized systems and the need for compliance with regulations.", "excerpt_keywords": "ISPE, Records, Data Integrity, Pharmaceutical Industry, Regulatory Compliance"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## records and data integrity\n\ngxp regulation\nthe underlying international pharmaceutical requirements, such as those set forth in the us fd&c act, us phs act, fda regulations, eu directives and guidelines, japanese regulations, or other applicable national legislation or regulations under which a company operates. these include but are not limited to:\n* good manufacturing practice (gmp) (pharmaceutical, including active pharmaceutical ingredient (api), veterinary, and blood)\n* good clinical practice (gcp)\n* good laboratory practice (glp)\n* good distribution practice (gdp)\n* good pharmacovigilance practice (gvp, also known as gpvp)\n\nharm (ich q9 [10])\ndamage to health, including the damage that can occur from loss of product quality or availability.\n\nhazard (iso/iec guide 51 [34])\nthe potential source of harm.\n\nhybrid situation\na situation where paper and electronic record and signature components co-exist.\n\ninfrastructure as a service (iaas) (nist special publication 800-145 [35])\nthe capability provided to the consumer is to provision processing, storage, networks, and other fundamental computing resources where the consumer is able to deploy and run arbitrary software, which can include operating systems and applications. 
the consumer does not manage or control the underlying cloud infrastructure but has control over operating systems, storage, and deployed applications; and possibly limited control of select networking components e.g. host firewalls.\n\nlikelihood of occurrence\nthe probability of a hazard occurring and causing harm.\n\nmetadata (mhra, 2016 [8])\ndata that describe the attributes of other data, and provide context and meaning. typically, these are data that describe the structure, data elements, inter-relationships and other characteristics of data. it also permits data to be attributable to an individual (or if automatically generated, to the original data source).\n\nout of specification (oos) (ispe glossary [36])\nan examination, measurement, or test result that does not comply with pre-established criteria.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "eb80239a-fe08-43c7-ba0b-62e7f006a355": {"__data__": {"id_": "eb80239a-fe08-43c7-ba0b-62e7f006a355", "embedding": null, "metadata": {"page_label": "148", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Ensuring Data Integrity and Record Management in Cloud Infrastructure through Platform as a Service (PaaS)", "questions_this_excerpt_can_answer": "1. How does the ISPE GAMP(R) Guide: Appendix G2 define the capabilities provided to consumers through Platform as a Service (PaaS) in terms of deployment and management of applications on cloud infrastructure?\n \n2. 
According to the ISPE Records and Data Integrity Guide, what constitutes raw data in the context of nonclinical laboratory studies as defined by the US FDA, including the proposed amendment in the Federal Register on November 22, 2016?\n\n3. What is the definition of a \"regulated record\" as outlined in the ISPE Records and Data Integrity Guide, and how does it differentiate from \"regulated data\" in terms of its purpose, content, and regulatory requirements?", "prev_section_summary": "The section discusses the importance of records and data integrity in the pharmaceutical industry, outlining specific Good Practice (GxP) regulations such as GMP, GCP, GLP, GDP, and GVP. It defines a \"hybrid situation\" as the co-existence of paper and electronic records, and explains the role of metadata in data management for ensuring integrity and traceability. Additionally, it mentions infrastructure as a service (IaaS), likelihood of occurrence, hazards, and out of specification (OOS) results.", "excerpt_keywords": "ISPE, GAMP, data integrity, cloud infrastructure, regulated record"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## ispe gamp(r) guide: appendix g2 platform as a service (paas) (nist special publication 800-145 [35]) records and data integrity\n\nthe capability provided to the consumer is to deploy onto the cloud infrastructure consumer-created or acquired applications created using programming languages, libraries, services, and tools supported by the provider. 
the consumer does not manage or control the underlying cloud infrastructure including network, servers, operating systems, or storage, but has control over the deployed applications and possibly configuration settings for the application-hosting environment.\n\n|primary record|(mhra, 2015 [1])|\n|---|---|\n|the record which takes primacy in cases where data that are collected and retained concurrently by more than one method fail to concur.| |\n|probability of detection|the probability that a fault will be detected before harm occurs.|\n|process owner|(ispe gamp(r) 5 [3])|\n|the person ultimately responsible for the business process or processes being managed.| |\n|quality risk management (qrm)|(ich q9 [10])|\n|a systematic process for the assessment, control, communication and review of risks to the quality of the drug (medicinal) product across the product lifecycle.| |\n|raw data| |\n|1. any laboratory worksheets, records, memoranda, notes, or exact copies thereof, that are the result of original observations and activities of a nonclinical laboratory study and are necessary for the reconstruction and evaluation of the report of that study. in the event that exact transcripts of raw data have been prepared (e.g., tapes which have been transcribed verbatim, dated, and verified accurate by signature), the exact copy or exact transcript may be substituted for the original source as raw data. raw data may include photographs, microfilm or microfiche copies, computer printouts, magnetic media, including dictated observations, and recorded data from automated instruments. (us fda, 21 cfr part 58, subpart a--general provisions, sec. 58.3 [37])| |\n|2. all original nonclinical laboratory study records and documentation or exact copies that maintain the original intent and meaning and are made according to the person's certified copy procedures. 
raw data includes any laboratory worksheets, correspondence, notes, and other documentation (regardless of capture medium) that are the result of original observations and activities of a nonclinical laboratory study and are necessary for the reconstruction and evaluation of the report of that study. raw data also includes the signed and dated pathology report. (us fda, 21 cfr part 58, subpart a--general provisions, sec. 58.3 definitions - proposed amendment in federal register, 81 fr 58341, 22 november 2016 [37])| |\n|regulated data| |\n|information used for a regulated purpose or to support a regulated process.| |\n|regulated record| |\n|a collection of regulated data (and any metadata necessary to provide meaning and context) with a specific gxp purpose, content, and meaning, and required by gxp regulations. records include instructions as well as data and reports.| |", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "6907425c-660e-4aff-9b60-3ad05c6dcf97": {"__data__": {"id_": "6907425c-660e-4aff-9b60-3ad05c6dcf97", "embedding": null, "metadata": {"page_label": "149", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Risk Management and Data Integrity in Software Systems: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. What is the definition of \"risk\" as per ISO/IEC Guide 51, and how does it relate to the probability and severity of harm in the context of data integrity and risk management in software systems?\n\n2. 
How does the ISPE Records and Data Integrity Guide describe the role and responsibilities of a system owner according to GAMP\u00ae 5, particularly in ensuring the security and maintenance of data within a system?\n\n3. In the context of clinical investigations, how does the FDA guidance for industry define \"source data,\" and what implications does this definition have for the reconstruction and evaluation of clinical trials according to the excerpt from the document titled \"Risk Management and Data Integrity in Software Systems: A Comprehensive Guide\"?", "prev_section_summary": "The section discusses the capabilities provided to consumers through Platform as a Service (PaaS) in terms of deploying applications on cloud infrastructure, as defined by the ISPE GAMP(R) Guide: Appendix G2. It also defines raw data in the context of nonclinical laboratory studies according to the US FDA, including a proposed amendment in the Federal Register. Additionally, it outlines the definition of a \"regulated record\" in comparison to \"regulated data\" in terms of purpose, content, and regulatory requirements. Key entities mentioned include primary record, process owner, quality risk management (QRM), raw data, regulated data, and regulated record.", "excerpt_keywords": "Risk management, Data integrity, Software systems, ISO/IEC Guide 51, FDA guidance"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n## records and data integrity\n\n|regression testing|testing geared toward demonstrating that a change has not affected a system or part of a system that it was not intended to affect.|\n|---|---|\n|risk (iso/iec guide 51 [34])|the combination of the probability of occurrence of harm and the severity of that harm.|\n|risk assessment (ich q9 [10])|a systematic process of organizing information to support a risk decision to be made within a risk management process. 
it consists of the identification of hazards and the analysis and evaluation of risks associated with exposure to those hazards.|\n|risk control (iso guide 73 [38])|actions implementing risk management decisions.|\n|risk identification (ich q9 [10])|the systematic use of information to identify potential sources of harm (hazards) referring to the risk question or problem description.|\n|software as a service (saas) (nist special publication 800-145 [35])|the capability provided to the consumer is to use the provider's applications running on a cloud infrastructure. the applications are accessible from various client devices through either a thin client interface, such as a web browser (e.g., web-based email), or a program interface. the consumer does not manage or control the underlying cloud infrastructure including network, servers, operating systems, storage, or even individual application capabilities, with the possible exception of limited user-specific application configuration settings.|\n|service level agreement (sla)|an agreement between an it service provider and a customer. the sla describes the it service, documents service level targets, and specifies the responsibilities of the it service provider and the customer. a single sla may cover multiple services or multiple customers.|\n|severity (ich q9 [10])|a measure of the possible consequences of a hazard.|\n|source data (fda guidance for industry: electronic source data in clinical investigations [39])|all information in original records and certified copies of original records of clinical findings, observations, or other activities (in a clinical investigation) used for the reconstruction and evaluation of the trial. 
source data are contained in source documents (original records or certified copies).|\n|system owner (ispe gamp (r) 5 [3])|the person ultimately responsible for the availability, and support and maintenance, of a system and for the security of the data residing on that system.|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "915b9556-ef46-4c11-8683-ec7df1062351": {"__data__": {"id_": "915b9556-ef46-4c11-8683-ec7df1062351", "embedding": null, "metadata": {"page_label": "150", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Blank Canvas: Exploring the Absence of Content in Art and Design\"", "questions_this_excerpt_can_answer": "Based on the provided context, here are three questions that the context can provide specific answers to, which are unlikely to be found elsewhere:\n\n1. **What is the significance of the document titled \"Blank Canvas: Exploring the Absence of Content in Art and Design\" within the ISPE Records and Data Integrity Guide?**\n - This question seeks to understand the relevance or connection of a document seemingly focused on art and design within the context of the ISPE (International Society for Pharmaceutical Engineering) guide, which typically focuses on pharmaceutical engineering, records, and data integrity. The unique title suggests a thematic exploration or metaphorical approach that might be used to discuss concepts of data integrity or record-keeping in an unconventional manner.\n\n2. 
**How does the document \"Blank Canvas: Exploring the Absence of Content in Art and Design\" contribute to the understanding of data integrity in the pharmaceutical industry?**\n - Given the unusual juxtaposition of art and design themes with pharmaceutical data integrity, this question probes into how concepts of emptiness or minimalism in art might be applied to the principles of data integrity, record accuracy, and reliability in a highly regulated industry. It implies that the document might offer innovative perspectives or methodologies for ensuring data integrity.\n\n3. **What methodologies or frameworks does the \"Blank Canvas\" document propose for addressing challenges in records and data integrity within the pharmaceutical sector?**\n - This question delves into the specifics of the document, seeking to uncover any novel approaches, frameworks, or methodologies it proposes for tackling the perennial challenges of maintaining records and ensuring data integrity in the pharmaceutical industry. It assumes that the document, despite its title suggesting a focus on art and design, provides actionable insights or strategies relevant to pharmaceutical professionals and regulators.\n\nThese questions are designed to explore the unique intersection of art/design concepts with the technical and regulatory aspects of pharmaceutical data integrity, as suggested by the context provided.", "prev_section_summary": "The section discusses key topics related to records and data integrity in software systems, including regression testing, risk assessment, risk control, software as a service (SaaS), service level agreements (SLAs), severity, and system owners. It also defines important terms such as risk, risk assessment, risk control, risk identification, software as a service, service level agreements, severity, and source data. 
The section highlights the responsibilities of a system owner in ensuring the security and maintenance of data within a system, as well as the definition of source data in clinical investigations according to FDA guidance.", "excerpt_keywords": "ISPE, Records, Data Integrity, Art, Design"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\nno_content_here", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "30b9c515-3082-4fd5-b577-481d92c8ff24": {"__data__": {"id_": "30b9c515-3082-4fd5-b577-481d92c8ff24", "embedding": null, "metadata": {"page_label": "151", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Blank Canvas: A Collection of Absence\"", "questions_this_excerpt_can_answer": "Based on the provided context, here are three questions that the context can provide specific answers to, which are unlikely to be found elsewhere:\n\n1. **What is the file size of the ISPE Records and Data Integrity Guide as stored in the specified directory?**\n - This question is directly answerable with the provided file size information (6245344 bytes), which is a specific detail unlikely to be found in other sources without access to the exact file in question.\n\n2. 
**What are the creation and last modification dates of the document titled \"Blank Canvas: A Collection of Absence\" found within the ISPE Records and Data Integrity Guide?**\n - The context provides unique creation (2024-04-07) and last modification dates (2024-04-04) for the document, which are specific to this version of the document and would not be easily found elsewhere, especially considering the document's title suggests it might be a unique or specialized section within the guide.\n\n3. **What is the document title associated with the ISPE Records and Data Integrity Guide PDF stored in the specified file path, and how does it relate to the guide's content?**\n - Given the document title \"Blank Canvas: A Collection of Absence,\" this question seeks to understand the relevance or connection of this title to the content or purpose of the ISPE Records and Data Integrity Guide. This is a nuanced inquiry that requires access to the document itself or insights provided by the context, as the title might suggest a thematic or metaphorical aspect specific to this version or edition of the guide, which would not be readily apparent without direct reference to the document or an in-depth understanding of its contents and objectives.", "prev_section_summary": "The section discusses a document titled \"Blank Canvas: Exploring the Absence of Content in Art and Design\" within the ISPE Records and Data Integrity Guide. It explores the significance of this document in relation to data integrity in the pharmaceutical industry, how it contributes to understanding data integrity, and what methodologies or frameworks it proposes for addressing challenges in records and data integrity within the pharmaceutical sector. 
The section highlights the unique intersection of art/design concepts with technical and regulatory aspects of pharmaceutical data integrity.", "excerpt_keywords": "ISPE, Records, Data Integrity, Guide, Pharmaceutical Industry"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\nno_content_here", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "fcfc4d72-d760-4179-845f-57d890accc4c": {"__data__": {"id_": "fcfc4d72-d760-4179-845f-57d890accc4c", "embedding": null, "metadata": {"page_label": "152", "file_name": "[13] ISPE Records and Data Integrity Guide.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[13] ISPE Records and Data Integrity Guide.pdf", "file_type": "application/pdf", "file_size": 6245344, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "ISPE Tampa Office Contact Information and Employee Directory", "questions_this_excerpt_can_answer": "1. What is the physical address of the ISPE office located in Tampa, Florida, as mentioned in the ISPE Records and Data Integrity Guide?\n \n2. As of the guide's last update in April 2024, what are the contact details, including telephone and fax numbers, for the ISPE Tampa office?\n\n3. Where can one find the ISPE Tampa Office Contact Information and Employee Directory within the ISPE Records and Data Integrity Guide, including the specific page number?", "prev_section_summary": "The key topics of this section include the file details of the ISPE Records and Data Integrity Guide, such as file size, creation date, and last modification date. 
The document title \"Blank Canvas: A Collection of Absence\" is also highlighted, suggesting a unique or specialized section within the guide. The section emphasizes the importance of understanding specific details and context related to the document to gain insights into its content and purpose.", "excerpt_keywords": "ISPE, Records, Data Integrity, Guide, Tampa Office"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[13] ISPE Records and Data Integrity Guide.pdf\n600 n. westshore blvd., suite 900, tampa, florida 33609 usa\n\ntel: +1-813-960-2105, fax: +1-813-264-2816\n\nwww.ispe.org", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "5089da61-cabb-48d8-a92e-25e1608fe005": {"__data__": {"id_": "5089da61-cabb-48d8-a92e-25e1608fe005", "embedding": null, "metadata": {"page_label": "A", "file_name": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_type": "application/pdf", "file_size": 307865, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "WHO Expert Committee on Pharmaceutical Preparations: Specifications and Recommendations for Pharmaceutical Products", "questions_this_excerpt_can_answer": "1. What is the title of the fifty-fifth report published by the WHO Expert Committee on Specifications for Pharmaceutical Preparations, as documented in the WHO Technical Report Series 1033?\n\n2. As of the document's last modification in March 2024, what specific guidelines related to data integrity are included in Annex 4 of the WHO Technical Report Series 1033?\n\n3. 
How does the WHO Technical Report Series 1033, specifically within its Annex 4, contribute to the development and maintenance of pharmaceutical product specifications and recommendations, according to the document titled \"WHO Expert Committee on Pharmaceutical Preparations: Specifications and Recommendations for Pharmaceutical Products\"?", "excerpt_keywords": "WHO, Technical Report Series, Pharmaceutical Preparations, Specifications, Recommendations"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf\n# who technical report series 1033\n\nwho expert committee on specifications for pharmaceutical preparations fifty-fifth report", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "3be8fcdb-fc57-4afb-8214-3486ba4f76f5": {"__data__": {"id_": "3be8fcdb-fc57-4afb-8214-3486ba4f76f5", "embedding": null, "metadata": {"page_label": "1", "file_name": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_type": "application/pdf", "file_size": 307865, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Data Integrity Management in Pharmaceutical Quality Systems: Ensuring Compliance and Quality Assurance", "questions_this_excerpt_can_answer": "1. What specific guidelines does the WHO TR 1033 Annex 4 provide for managing data integrity within pharmaceutical quality systems, particularly in relation to data governance and quality risk management?\n\n2. 
How does the document detail the implementation of corrective and preventive actions (CAPA) in the context of data integrity issues within pharmaceutical operations, according to the sections listed in the excerpt?\n\n3. What are the recommended practices for data review and approval in pharmaceutical companies as outlined in the WHO TR 1033 Annex 4 guideline, and how do these practices ensure compliance and quality assurance?", "prev_section_summary": "The section discusses the WHO Technical Report Series 1033, specifically focusing on the fifty-fifth report by the WHO Expert Committee on Specifications for Pharmaceutical Preparations. It highlights the guidelines related to data integrity included in Annex 4 of the report and how this contributes to the development and maintenance of pharmaceutical product specifications and recommendations.", "excerpt_keywords": "WHO TR 1033, data integrity, pharmaceutical quality systems, corrective and preventive actions, data review and approval"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf\n|content|page number|\n|---|---|\n|introduction and background|137|\n|scope|137|\n|glossary|138|\n|data governance|140|\n|quality risk management|144|\n|management review|145|\n|outsourcing|146|\n|training|146|\n|data, data transfer and data processing|147|\n|good documentation practices|148|\n|computerized systems|149|\n|data review and approval|152|\n|corrective and preventive actions|152|\n|references|153|\n|further reading|153|\n|appendix 1 examples in data integrity management|155|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "84022bb6-1e17-41de-99fd-d2ca927fb232": {"__data__": {"id_": "84022bb6-1e17-41de-99fd-d2ca927fb232", "embedding": null, "metadata": 
{"page_label": "2", "file_name": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_type": "application/pdf", "file_size": 307865, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "\"Guidelines for Good Data and Record Management Practices to Strengthen Data Integrity in Pharmaceutical Preparations: A WHO Perspective\"", "questions_this_excerpt_can_answer": "1. What are the primary reasons identified by the WHO for the increasing observations related to data integrity issues in GMP, GCP, GLP, and GTDP inspections?\n \n2. How does the WHO define the principles of data governance and management to ensure the reliability of data and records in GxP activities, as outlined in the \"Guidelines for Good Data and Record Management Practices to Strengthen Data Integrity in Pharmaceutical Preparations\"?\n\n3. What specific guidance does the WHO provide for contract givers to ensure the integrity of data provided by contract acceptors, as mentioned in the \"Guidelines for Good Data and Record Management Practices to Strengthen Data Integrity in Pharmaceutical Preparations\"?", "prev_section_summary": "The section provides guidelines on data integrity management within pharmaceutical quality systems, focusing on data governance, quality risk management, management review, outsourcing, training, data transfer and processing, good documentation practices, computerized systems, data review and approval, and corrective and preventive actions. The document outlines specific practices and recommendations to ensure compliance and quality assurance in pharmaceutical operations. 
Key entities include the WHO TR 1033 Annex 4 guideline, corrective and preventive actions (CAPA), data review and approval processes, and examples in data integrity management.", "excerpt_keywords": "WHO, data integrity, pharmaceutical preparations, GxP activities, regulatory requirements"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf\n## who expert committee on specifications for pharmaceutical preparations fifty-fifth report\n\n1. introduction and background\n\n1.1. in recent years, the number of observations made regarding the integrity of data, documentation and record management practices during inspections of good manufacturing practice (gmp), good clinical practice (gcp), good laboratory practice (glp) and good trade and distribution practices (gtdp) has been increasing. the possible causes for this may include (i) reliance on inadequate human practices; (ii) poorly defined procedures; (iii) resource constraints; (iv) the use of computerized systems that are not capable of meeting regulatory requirements or are inappropriately managed and validated; (v) inappropriate and inadequate control of data flow; and (vi) failure to adequately review and manage original data and records.\n\n1.2. data governance and related measures should be part of a quality system, and are important to ensure the reliability of data and records in good practice (gxp) activities and regulatory submissions. the data and records should be attributable, legible, contemporaneous, original and accurate, complete, consistent, enduring, and available; commonly referred to as \"alcoa+\".\n\n1.3. this document replaces the who guidance on good data and record management practices (annex 5, who technical report series, no. 996, 2016).\n\n2. scope\n\n2.1. 
this document provides information, guidance and recommendations to strengthen data integrity in support of product quality, safety and efficacy. the aim is to ensure compliance with regulatory requirements in, for example clinical research, production and quality control, which ultimately contributes to patient safety. it covers electronic, paper and hybrid systems.\n\n2.2. the guideline covers \"gxp\" for medical products. the principles could also be applied to other products such as vector control products.\n\n2.3. the principles of this guideline also apply to contract givers and contract acceptors. contract givers are ultimately responsible for the integrity of data provided to them by contract acceptors. contract givers should therefore ensure that contract acceptors have the appropriate capabilities and comply with the principles contained in this guideline and documented in quality agreements.\n\nwho technical report series, no. 1033, 2021", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "a69ca2fc-6a9f-4dac-902e-2035b92c9f7b": {"__data__": {"id_": "a69ca2fc-6a9f-4dac-902e-2035b92c9f7b", "embedding": null, "metadata": {"page_label": "3", "file_name": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_type": "application/pdf", "file_size": 307865, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Data Integrity Guidelines and Glossary: Ensuring Accuracy and Consistency in Data Management", "questions_this_excerpt_can_answer": "1. 
How does the WHO TR 1033 Annex 4 Guideline on Data Integrity define the acronym ALCOA+ and what additional attributes does it emphasize beyond the original ALCOA principles?\n \n2. What specific processes and controls does the WHO TR 1033 Annex 4 Guideline recommend for archiving GxP records to ensure their integrity and availability throughout the required retention period?\n\n3. According to the WHO TR 1033 Annex 4 Guideline, what constitutes an audit trail in the context of GxP records, and what key information should it provide to facilitate the reconstruction of the history of events relating to the record?", "prev_section_summary": "The section discusses the increasing observations related to data integrity issues in GMP, GCP, GLP, and GTDP inspections, citing reasons such as inadequate human practices, poorly defined procedures, and resource constraints. It emphasizes the importance of data governance and management in ensuring the reliability of data and records in GxP activities. The document provides guidance on strengthening data integrity in pharmaceutical preparations, covering electronic, paper, and hybrid systems. It also addresses the responsibilities of contract givers and acceptors in ensuring data integrity. The section highlights the principles of data governance, the attributes of reliable data and records (ALCOA+), and the scope of the guidelines for good data and record management practices.", "excerpt_keywords": "Data Integrity, WHO TR 1033, ALCOA+, Archiving, Audit Trail"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf\n## annex 4\n\n2.4. where possible, this guideline has been harmonised with other published documents on data integrity. 
this guideline should also be read with other who good practices guidelines and publications including, but not limited to, those listed in the references section of this document.\n\n### glossary\n\nthe definitions given below apply to the terms used in these guidelines. they may have different meanings in other contexts.\n\n|alcoa+|a commonly used acronym for \"attributable, legible, contemporaneous, original and accurate\" which puts additional emphasis on the attributes of being complete, consistent, enduring and available throughout the data life cycle for the defined retention period.|\n|---|---|\n|archiving|archiving is the process of long-term storage and protection of records from the possibility of deterioration, and being altered or deleted, throughout the required retention period. archived records should include the complete data, for example, paper records, electronic records including associated metadata such as audit trails and electronic signatures. within a glp context, the archived records should be under the control of independent data management personnel throughout the required retention period.|\n|audit trail|the audit trail is a form of metadata containing information associated with actions that relate to the creation, modification or deletion of gxp records. an audit trail provides for a secure recording of life cycle details such as creation, additions, deletions or alterations of information in a record, either paper or electronic, without obscuring or overwriting the original record. an audit trail facilitates the reconstruction of the history of such events relating to the record regardless of its medium, including the \"who, what, when and why\" of the action.|\n|backup|the copying of live electronic data, at defined intervals, in a secure manner to ensure that the data are available for restoration.|\n|certified true copy or true copy|a copy (irrespective of the type of media used) of the original record that has been verified (i.e. 
by a dated signature or by generation through a validated process) to have the same information, including data that describe the context, content, and structure, as the original.|\n|data|all original records and true copies of original records, including source data and metadata, and all subsequent transformations and reports of these data which are generated or recorded at the time of the gmp activity and which|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "32ac266a-f9a2-48c6-94f5-ee54f90bd4ba": {"__data__": {"id_": "32ac266a-f9a2-48c6-94f5-ee54f90bd4ba", "embedding": null, "metadata": {"page_label": "4", "file_name": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_type": "application/pdf", "file_size": 307865, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Ensuring Data Integrity and Governance in Pharmaceutical Preparations: Quality and Safety through GMP Activities", "questions_this_excerpt_can_answer": "1. What are the various media formats recognized by the WHO guidelines for recording information related to GMP activities in pharmaceutical preparations, and how do they contribute to ensuring data integrity?\n\n2. How does the WHO TR 1033 Annex 4 Guideline define \"data governance\" in the context of pharmaceutical preparations, and what are the key elements that ensure data quality throughout the data life cycle?\n\n3. 
What role does a Data Integrity Risk Assessment (DIRA) play according to the WHO guidelines in managing the integrity of data within the pharmaceutical industry, and what steps are involved in this process?", "prev_section_summary": "The section discusses the WHO TR 1033 Annex 4 Guideline on data integrity, focusing on key concepts such as ALCOA+ principles, archiving of GxP records, audit trails, backup procedures, and certified true copies. It emphasizes the importance of ensuring data accuracy, consistency, and availability throughout the data life cycle for the defined retention period. The guidelines provide recommendations for maintaining the integrity of records, including the secure storage and protection of data, the creation of audit trails to track actions related to records, and the verification of true copies to ensure data authenticity.", "excerpt_keywords": "WHO guidelines, data integrity, pharmaceutical preparations, GMP activities, data governance"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf\n## who expert committee on specifications for pharmaceutical preparations fifty-fifth report\n\nallow full and complete reconstruction and evaluation of the gmp activity. data should be accurately recorded by permanent means at the time of the activity. data may be contained in paper records (such as worksheets and logbooks), electronic records and audit trails, photographs, microfilm or microfiche, audio or video files or any other media whereby information related to gmp activities is recorded.\n\ndata criticality. this is defined by the importance of the data for the quality and safety of the product and how important data are for a quality decision within production or quality control.\n\ndata governance. the sum total of arrangements which provide assurance of data quality. 
these arrangements ensure that data, irrespective of the process, format or technology in which it is generated, recorded, processed, retained, retrieved and used will ensure an attributable, legible, contemporaneous, original, accurate, complete, consistent, enduring and available record throughout the data life cycle.\n\ndata integrity risk assessment (dira). the process to map out procedures, systems and other components that generate or obtain data; to identify and assess risks and implement appropriate controls to prevent or minimize lapses in the integrity of the data.\n\ndata life cycle. all phases of the process by which data are created, recorded, processed, reviewed, analysed and reported, transferred, stored and retrieved and monitored, until retirement and disposal. there should be a planned approach to assessing, monitoring and managing the data and the risks to those data, in a manner commensurate with the potential impact on patient safety, product quality and/or the reliability of the decisions made throughout all phases of the data life cycle.\n\ndynamic data. dynamic formats, such as electronic records, allow an interactive relationship between the user and the record content. for example, electronic records in database formats allow the user to track, trend and query data; chromatography records maintained as electronic records allow the user or reviewer (with appropriate access permissions) to reprocess the data and expand the baseline to view the integration more clearly.\n\nelectronic signatures. a signature in digital form (bio-metric or non-biometric) that represents the signatory. in legal terms, it is the equivalent of the handwritten signature of the signatory.\n\nwho technical report series, no. 1033, 2021\n\ngood practices (gxp). 
an acronym for the group of good practice guides governing the preclinical, clinical, manufacturing, testing, storage, distribution", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "22ec3058-6fd4-483c-86b9-97cf91ace780": {"__data__": {"id_": "22ec3058-6fd4-483c-86b9-97cf91ace780", "embedding": null, "metadata": {"page_label": "5", "file_name": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_type": "application/pdf", "file_size": 307865, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Ensuring Data Integrity and Governance in Regulated Pharmaceuticals, Biologicals, and Medical Devices", "questions_this_excerpt_can_answer": "1. What specific examples of metadata are considered crucial for understanding and evaluating the meaning of data within the context of regulated pharmaceuticals, biologicals, and medical devices, according to the WHO TR 1033 Annex 4 guidelines?\n\n2. How does the WHO TR 1033 Annex 4 guideline define \"raw data\" in the context of data integrity for regulated pharmaceuticals, biologicals, and medical devices, and what is its significance in the first-capture of information?\n\n3. What are the responsibilities of senior management in ensuring data integrity according to the WHO TR 1033 Annex 4 guideline, particularly in relation to the implementation of systems and procedures to minimize and identify risks to data integrity?", "prev_section_summary": "The section discusses the importance of data integrity and governance in pharmaceutical preparations, as outlined in the WHO TR 1033 Annex 4 Guideline. 
Key topics include various media formats for recording GMP activities, data criticality, data governance, Data Integrity Risk Assessment (DIRA), data life cycle, dynamic data, and electronic signatures. The section emphasizes the need for accurate and permanent recording of data, assurance of data quality, risk assessment to prevent data integrity lapses, and the management of data throughout its life cycle to ensure patient safety and product quality.", "excerpt_keywords": "Data integrity, Governance, Pharmaceuticals, Biologicals, Medical devices"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf\n## annex 4\n\nand post-market activities for regulated pharmaceuticals, biologicals and medical devices, such as glp, gcp, gmp, good pharmacovigilance practices (gvp) and good distribution practices (gdp).\n\nhybrid system. the use of a combination of electronic systems and paper systems.\n\nmedical product. a term that includes medicines, vaccines, diagnostics and medical devices.\n\nmetadata. metadata are data that provide the contextual information required to understand other data. these include structural and descriptive metadata, which describe the structure, data elements, interrelationships and other characteristics of data. they also permit data to be attributable to an individual. metadata that are necessary to evaluate the meaning of data should be securely linked to the data and subject to adequate review. for example, in the measurement of weight, the number 8 is meaningless without metadata, such as, the unit, milligram, gram, kilogram, and so on. other examples of metadata include the time or date stamp of an activity, the operator identification (id) of the person who performed an activity, the instrument id used, processing parameters, sequence files, audit trails and other data required to understand data and reconstruct activities.\n\nraw data. 
the original record (data) which can be described as the first-capture of information, whether recorded on paper or electronically. raw data is synonymous with source data.\n\nstatic data. a static record format, such as a paper or electronic record, that is fixed and allows little or no interaction between the user and the record content. for example, once printed or converted to static electronic format chromatography records lose the capability of being reprocessed or enabling more detailed viewing of baseline.\n\n## data governance\n\n4.1. there should be a written policy on data integrity.\n\n4.2. senior management should be accountable for the implementation of systems and procedures in order to minimise the potential risk to data integrity, and to identify the residual risk using risk management techniques such as the principles of the guidance on quality risk management from who (5) and the international council for harmonisation of technical requirements for pharmaceuticals for human use (ich) (6).", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "a38e5cf8-2206-46d5-abe0-9ac38e286ec5": {"__data__": {"id_": "a38e5cf8-2206-46d5-abe0-9ac38e286ec5", "embedding": null, "metadata": {"page_label": "6", "file_name": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_type": "application/pdf", "file_size": 307865, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "\"Ensuring Data Integrity and Governance in Pharmaceutical Preparations: A Comprehensive Guide\"", "questions_this_excerpt_can_answer": "1. 
What specific responsibilities does senior management have in establishing and maintaining a data governance system within pharmaceutical preparations, according to the WHO Technical Report Series No. 1033, 2021?\n \n2. How does the WHO guideline on data integrity recommend addressing the segregation of duties and roles within the data governance framework to ensure compliance with data integrity principles in the pharmaceutical industry?\n\n3. What are the recommended control strategies outlined in the WHO Technical Report Series No. 1033, 2021, for mitigating risks to data integrity within pharmaceutical organizations, and how do these strategies incorporate quality risk management principles?", "prev_section_summary": "The section discusses the importance of data integrity and governance in regulated pharmaceuticals, biologicals, and medical devices according to the WHO TR 1033 Annex 4 guidelines. Key topics include metadata, raw data, hybrid systems, medical products, and static data. The section emphasizes the need for contextual information (metadata) to understand data, defines raw data as the original record or source data, and outlines the responsibilities of senior management in implementing systems and procedures to minimize risks to data integrity. The importance of a written policy on data integrity and the use of risk management techniques are also highlighted.", "excerpt_keywords": "data governance, senior management, alcoa+ principles, quality risk management, pharmaceutical preparations"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf\n#### who expert committee on specifications for pharmaceutical preparations fifty-fifth report\n\n4.3. senior management is responsible for the establishment, implementation and control of an effective data governance system. data governance should be embedded in the quality system. 
the necessary policies, procedures, training, monitoring and other systems should be implemented.\n\n4.4. data governance should ensure the application of alcoa+ principles.\n\n4.5. senior management is responsible for providing the environment to establish, maintain and continually improve the quality culture, supporting the transparent and open reporting of deviations, errors or omissions and data integrity lapses at all levels of the organization. appropriate, immediate action should be taken when falsification of data is identified. significant lapses in data integrity that may impact patient safety, product quality or efficacy should be reported to the relevant medicine regulatory authorities.\n\n4.6. the quality system, including documentation such as procedures and formats for recording and reviewing of data, should be appropriately designed and implemented in order to provide assurance that records and data meet the principles contained in this guideline.\n\n4.7. data governance should address the roles, responsibilities, accountability and define the segregation of duties throughout the life cycle and consider the design, operation and monitoring of processes/systems to comply with the principles of data integrity, including control over authorized and unauthorized changes to data.\n\n4.8. data governance control strategies using quality risk management (qrm) principles (5) are required to prevent or mitigate risks. the control strategy should aim to implement appropriate technical, organizational and procedural controls. 
examples of controls may include, but are not limited to:\n\n- the establishment and implementation of procedures that will facilitate compliance with data integrity requirements and expectations;\n- the adoption of a quality culture within the company that encourages personnel to be transparent about failures, which includes a reporting mechanism inclusive of investigation and follow-up processes;\n- the implementation of appropriate controls to eliminate or reduce risks to an acceptable level throughout the life cycle of the data;\n\nwho technical report series, no. 1033, 2021\n\n- ensuring sufficient time and resources are available to implement and complete a data integrity programme; to monitor compliance", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "0d654b22-ab0e-442a-a395-5966b3cb35ce": {"__data__": {"id_": "0d654b22-ab0e-442a-a395-5966b3cb35ce", "embedding": null, "metadata": {"page_label": "7", "file_name": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_type": "application/pdf", "file_size": 307865, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Data Governance and Integrity Policies and Procedures Document", "questions_this_excerpt_can_answer": "1. What specific strategies does the WHO TR 1033 Annex 4 Guideline recommend for enhancing data integrity through personnel management in pharmaceutical environments?\n \n2. How does the WHO TR 1033 Annex 4 Guideline propose to manage and improve data governance systems, particularly in the context of error reporting and continual improvement feedback mechanisms?\n\n3. 
What are the key components of a data governance programme as outlined in the WHO TR 1033 Annex 4 Guideline, especially in relation to compliance with data protection legislation and the management of data integrity anomalies?", "prev_section_summary": "The section discusses the responsibilities of senior management in establishing and maintaining a data governance system within pharmaceutical preparations, as outlined in the WHO Technical Report Series No. 1033, 2021. It emphasizes the importance of embedding data governance in the quality system, ensuring the application of ALCOA+ principles, and providing a quality culture that supports transparent reporting of deviations and data integrity lapses. The section also highlights the need for appropriate controls, segregation of duties, and the use of quality risk management principles to mitigate risks to data integrity. Key topics include the establishment of procedures for compliance, fostering a quality culture, implementing controls to reduce risks, and ensuring sufficient time and resources for a data integrity program. Key entities mentioned are senior management, data governance, quality system, ALCOA+ principles, quality risk management, and medicine regulatory authorities.", "excerpt_keywords": "WHO TR 1033, data integrity, pharmaceutical environments, data governance, quality system"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf\n## annex 4\n\nwith data integrity policies, procedures and processes through e.g. 
audits and self-inspections; and to facilitate continuous improvement of both;\n\n- the assignment of qualified and trained personnel and provision of regular training for personnel in, for example, gxp, and the principles of data integrity in computerized systems and manual/paper based systems;\n- the implementation and validation of computerized systems appropriate for their intended use, including all relevant data integrity requirements in order to ensure that the computerized system has the necessary controls to protect the electronic data (3); and\n- the definition and management of the appropriate roles and responsibilities for contract givers and contract acceptors, entered into quality agreements and contracts including a focus on data integrity requirements.\n\n4.9. data governance systems should include, for example:\n\n- the creation of an appropriate working environment;\n- active support of continual improvement in particular based on collecting feedback; and\n- review of results, including the reporting of errors, unauthorized changes, omissions and undesirable results.\n\n4.10. the data governance programme should include policies and procedures addressing data management. 
these should at least where applicable, include:\n\n- management oversight and commitment;\n- the application of qrm;\n- compliance with data protection legislation and best practices;\n- qualification and validation policies and procedures;\n- change, incident and deviation management;\n- data classification, confidentiality and privacy;\n- security, cybersecurity, access and configuration control;\n- database build, data collection, data review, blinded data, randomization;\n- the tracking, trending, reporting of data integrity anomalies, and lapses or failures for further action;", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "d9605f2e-0fa1-4295-8361-c65ddd8cd11e": {"__data__": {"id_": "d9605f2e-0fa1-4295-8361-c65ddd8cd11e", "embedding": null, "metadata": {"page_label": "8", "file_name": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_type": "application/pdf", "file_size": 307865, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Data Integrity Controls in Pharmaceutical Preparations: Ensuring Accuracy and Security in the Manufacturing Process", "questions_this_excerpt_can_answer": "1. What specific principles does the WHO TR 1033 Annex 4 Guideline recommend for the regular review of data to ensure consistency with ALCOA+ principles in the context of pharmaceutical preparations?\n \n2. According to the WHO TR 1033 Annex 4 Guideline, how should the effort and resources dedicated to assuring data integrity be determined within the pharmaceutical manufacturing process?\n\n3. 
What are the specific requirements outlined in the WHO TR 1033 Annex 4 Guideline for maintaining records (both paper and electronic) to ensure compliance with data integrity controls in the pharmaceutical industry?", "prev_section_summary": "The section discusses the importance of data integrity policies, procedures, and processes in pharmaceutical environments, as outlined in the WHO TR 1033 Annex 4 Guideline. Key topics include personnel management, training, validation of computerized systems, roles and responsibilities in contracts, data governance systems, continual improvement, error reporting, compliance with data protection legislation, data management policies and procedures, and tracking and reporting of data integrity anomalies. Key entities mentioned include qualified and trained personnel, computerized systems, contract givers and acceptors, management oversight, qrm, data protection legislation, security measures, and data classification.", "excerpt_keywords": "WHO TR 1033, data integrity, pharmaceutical preparations, ALCOA+ principles, compliance"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf\n## who expert committee on specifications for pharmaceutical preparations fifty-fifth report\n\n- the prevention of commercial, political, financial and other organizational pressures;\n- adequate resources and systems;\n- workload and facilities to facilitate the right environment that supports di and effective controls;\n- monitoring;\n- record-keeping;\n- training; and\n- awareness of the importance of data integrity, product quality and patient safety.\n\n4.11. there should be a system for the regular review of data for consistency with alcoa+ principles. this includes paper records and electronic records in day-to-day work, system and facility audits and self-inspections.\n\n4.12. 
the effort and resources applied to assure the integrity of the data should be commensurate with the risk and impact of a data integrity failure.\n\n4.13. where weaknesses in data integrity are identified, the appropriate corrective and preventive actions (capa) should be implemented across all relevant activities and systems and not in isolation.\n\n4.14. changing from paper-based systems to automated or computerised systems (or vice-versa) will not in itself remove the need for appropriate data integrity controls.\n\n4.15. records (paper and electronic) should be kept in a manner that ensures compliance with the principles of this guideline. these include but are not limited to:\n\n- ensuring time accuracy of the system generating the record, accurately configuring and verifying time zone and time synchronisation, and restricting the ability to change dates, time zones and times for recording events;\n- using controlled documents and forms for recording gxp data;\n- defining access and privilege rights to gxp automated and computerized systems, ensuring segregation of duties;\n- ensuring audit trail activation for all interactions and restricting the ability to enable or disable audit trails (note: back-end changes and hard changes, such as hard deletes, should not be allowed). 
where audit trails can be disabled, then this action should also appear in the audit trail;", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "a8f82915-1f64-475e-88eb-58c881cd7cf9": {"__data__": {"id_": "a8f82915-1f64-475e-88eb-58c881cd7cf9", "embedding": null, "metadata": {"page_label": "9", "file_name": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_type": "application/pdf", "file_size": 307865, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Ensuring Data Integrity and Quality Risk Management in GxP Activities: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. What specific strategies does the WHO TR 1033 Annex 4 Guideline recommend for minimizing unnecessary data transcription and conversion between paper and electronic formats in GxP activities?\n\n2. How does the WHO TR 1033 Annex 4 Guideline propose to ensure the integrity and effectiveness of data recording and storage systems in GxP environments, particularly in relation to the adoption of new technologies?\n\n3. What approach does the WHO TR 1033 Annex 4 Guideline suggest for conducting Data Integrity Risk Assessments (DIRA) within GxP activities, including the factors to consider and the frequency of risk review?", "prev_section_summary": "The section discusses the importance of data integrity controls in pharmaceutical preparations, as outlined in the WHO TR 1033 Annex 4 Guideline. 
Key topics include the principles recommended for the regular review of data to ensure consistency with ALCOA+ principles, determining the effort and resources needed for data integrity assurance, maintaining records (both paper and electronic) to comply with data integrity controls, implementing corrective and preventive actions for identified weaknesses, and ensuring compliance with guidelines for record-keeping, training, and awareness of data integrity, product quality, and patient safety. The section emphasizes the need for regular data review, appropriate resource allocation based on risk, and proper record-keeping practices to uphold data integrity in pharmaceutical manufacturing processes.", "excerpt_keywords": "data integrity, quality risk management, GxP activities, WHO TR 1033, guideline"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf\n## annex 4\n\n- having automated data capture systems and printers connected to equipment and instruments in production (such as supervisory control and data acquisition (scada), human machine interface (hmi) and programmable logic controller (plc) systems), in quality control, and in clinical research (such as clinical data management (cdm) systems), where possible;\n\n- designing processes in a way to avoid the unnecessary transcription of data or unnecessary conversion from paper to electronic and vice versa; and\n\n- ensuring the proximity of an official gxp time source to the site of gxp activity and record creation.\n\n4.16. systems, procedures and methodology used to record and store data should be periodically reviewed for effectiveness. these should be updated throughout the data life cycle, as necessary, where new technology becomes available. new technology must be evaluated before implementation to verify the impact on data integrity.\n\n### 5. 
quality risk management\n\nnote: documentation of data flows and data process maps is recommended to facilitate the assessment, mitigation and control of data integrity risks across the actual and intended data process(es).\n\n5.1. data integrity risk assessment (dira) should be carried out in order to identify and assess areas of risk. this should cover systems and processes that produce data, or where data are obtained, and their inherent risks. the diras should be risk-based, cover the life cycle of data and consider data criticality. data criticality may be determined by considering how the data is used to influence the decisions made. the diras should be documented and reviewed, as required, to ensure that they remain current.\n\n5.2. the risk assessments should evaluate, for example, the relevant gxp computerised systems, supporting personnel, training, quality systems and outsourced activities.\n\n5.3. di risks should be assessed and mitigated. controls and residual risks should be communicated. 
risk review should be done throughout the document and data life cycle at a frequency based on the risk level, as determined by the risk assessment process.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "6ecfdd37-d09c-428b-b11c-bd02fb59c2dc": {"__data__": {"id_": "6ecfdd37-d09c-428b-b11c-bd02fb59c2dc", "embedding": null, "metadata": {"page_label": "10", "file_name": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_type": "application/pdf", "file_size": 307865, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Ensuring Data Integrity Compliance in Pharmaceutical Preparations: A Comprehensive Guide to Risk Assessment, Controls, and Management Review", "questions_this_excerpt_can_answer": "1. What specific actions does the WHO guideline recommend for managing areas identified as needing remedial action to ensure data integrity in pharmaceutical preparations, according to the fifty-fifth report of the WHO Expert Committee on Specifications for Pharmaceutical Preparations?\n\n2. Can you detail the types of controls the WHO TR 1033 Annex 4 guideline suggests implementing to prevent and detect situations that may compromise data integrity in the pharmaceutical industry?\n\n3. 
How does the WHO TR 1033 Annex 4 guideline propose evaluating the effectiveness of controls implemented to support data integrity compliance, especially in relation to computerized systems and software within the pharmaceutical sector?", "prev_section_summary": "The section discusses strategies recommended by the WHO TR 1033 Annex 4 Guideline for minimizing unnecessary data transcription and conversion between paper and electronic formats in GxP activities. It also covers the importance of automated data capture systems, avoiding unnecessary data transcription, and ensuring the proximity of an official gxp time source. Additionally, it emphasizes the need for periodic review of systems, procedures, and methodology used to record and store data, as well as the implementation of new technology to verify its impact on data integrity. The section also highlights the importance of conducting Data Integrity Risk Assessments (DIRA) to identify and assess areas of risk, covering systems, processes, data criticality, and the evaluation of relevant factors such as computerized systems, personnel, training, quality systems, and outsourced activities. Controls and residual risks should be communicated, and risk review should be done throughout the document and data life cycle at a frequency based on the risk level.", "excerpt_keywords": "WHO, data integrity, pharmaceutical preparations, risk assessment, controls"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf\n## who expert committee on specifications for pharmaceutical preparations fifty-fifth report\n\n5.4. where the risk assessment has highlighted areas for remedial action, the prioritisation of actions (including the acceptance of an appropriate level of residual risk) and the prioritisation of controls should be documented and communicated. 
where long-term remedial actions are identified, risk-reducing short-term measures should be implemented in order to provide acceptable data governance in the interim.\n\n5.5. controls identified may include organizational, procedural and technical controls such as procedures, processes, equipment, instruments and other systems in order to both prevent and detect situations that may impact on data integrity. examples include the appropriate content and design of procedures, formats for recording, access control, the use of computerized systems and other means.\n\n5.6. efficient risk-based controls should be identified and implemented to address risks impacting data integrity. risks include, for example, the deletion of, changes to and exclusion of data or results from data sets without written justification, authorisation where appropriate, and detection. the effectiveness of the controls should be verified (see appendix 1 for examples).\n\n## management review\n\n6.1. management should ensure that systems (such as computerized systems and paper systems) are meeting regulatory requirements in order to support data integrity compliance.\n\n6.2. the acquisition of non-compliant computerized systems and software should be avoided. where existing systems do not meet current requirements, appropriate controls should be identified and implemented based on risk assessment.\n\n6.3. the effectiveness of the controls implemented should be evaluated through, for example:\n\n- the tracking and trending of data;\n- a review of data, metadata and audit trails (e.g. in warehouse and material management, production, quality control, case report forms and data processing); and\n- routine audits and/or self-inspections, including data integrity and computerized systems.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "6c7f01ee-7f4c-48f8-9fd2-7c9092028f9d": {"__data__": {"id_": "6c7f01ee-7f4c-48f8-9fd2-7c9092028f9d", "embedding": null, "metadata": {"page_label": "11", "file_name": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_type": "application/pdf", "file_size": 307865, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Ensuring Data Integrity and Compliance in Outsourcing and Training Operations", "questions_this_excerpt_can_answer": "1. What specific considerations should be made when drafting written agreements for outsourcing activities to ensure compliance with data integrity requirements according to the WHO TR 1033 Annex 4 guidelines?\n\n2. How does the WHO TR 1033 Annex 4 guideline recommend ensuring the integrity and confidentiality of data when data and document retention responsibilities are outsourced to a third party?\n\n3. What are the training requirements outlined in the WHO TR 1033 Annex 4 guideline for personnel who interact with GxP data and perform GxP activities, especially in relation to computerized systems?", "prev_section_summary": "The section discusses the WHO Expert Committee on Specifications for Pharmaceutical Preparations' recommendations for ensuring data integrity compliance in pharmaceutical preparations. 
Key topics include risk assessment, prioritization of actions and controls, organizational, procedural, and technical controls, risk-based controls, management review, compliance with regulatory requirements, acquisition of compliant computerized systems, evaluation of control effectiveness through tracking and trending of data, review of data and audit trails, and routine audits/self-inspections for data integrity and computerized systems. Key entities mentioned include the WHO Expert Committee, management, computerized systems, procedures, processes, equipment, instruments, data integrity, and regulatory requirements.", "excerpt_keywords": "Outsourcing, Data integrity, Compliance, Training, GxP activities"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf\n## annex 4\n\n7. outsourcing\n\n7.1. the selection of a contract acceptor should be done in accordance with an authorized procedure. the outsourcing of activities, ownership of data, and responsibilities of each party (contract giver and contract acceptor) should be clearly described in written agreements. specific attention should be given to ensuring compliance with data integrity requirements. provisions should be made for responsibilities relating to data when an agreement expires.\n\n7.2. compliance with the principles and responsibilities should be verified during periodic site audits. this should include the review of procedures and data (including raw data and metadata, paper records, electronic data, audit trails and other related data) held by the relevant contract acceptor identified in risk assessment.\n\n7.3. where data and document retention are contracted to a third party, particular attention should be given to security, transfer, storage, access and restoration of data held under that agreement, as well as controls to ensure the integrity of data over their life cycle. 
this includes static data and dynamic data. mechanisms, procedures and tools should be identified to ensure data integrity and data confidentiality, for example, version control, access control, and encryption.\n\n7.4. gxp activities, including outsourcing of data management, should not be sub-contracted to a third party without the prior approval of the contract giver. this should be stated in the contractual agreements.\n\n7.5. all contracted parties should be aware of the requirements relating to data governance, data integrity and data management.\n\n8. training\n\n8.1. all personnel who interact with gxp data and who perform gxp activities should be trained in relevant data integrity principles and abide by organization policies and procedures. this should include understanding the potential consequences in cases of non-compliance.\n\n8.2. personnel should be trained in good documentation practices and measures to prevent and detect data integrity issues.\n\n8.3. specific training should be given in cases where computerized systems are used in the generation, processing, interpretation and reporting of data.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "24e707bb-2f56-4d73-9a7b-a9f3450736b7": {"__data__": {"id_": "24e707bb-2f56-4d73-9a7b-a9f3450736b7", "embedding": null, "metadata": {"page_label": "12", "file_name": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_type": "application/pdf", "file_size": 307865, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Ensuring Data Integrity and Effective Management in Pharmaceutical Preparations", 
"questions_this_excerpt_can_answer": "1. What specific training topics are recommended by the WHO expert committee for personnel involved in managing GxP computerized systems to ensure data integrity in pharmaceutical preparations?\n \n2. How does the WHO TR 1033 Annex 4 Guideline address the integrity of data captured in non-traditional formats such as photographs, videos, and thin layer chromatography plates within the pharmaceutical industry?\n\n3. What principles and considerations does the guideline recommend for ensuring data integrity during the data transfer or migration processes in the context of pharmaceutical preparations?", "prev_section_summary": "The section discusses the importance of outsourcing activities in accordance with authorized procedures, including clear descriptions of ownership of data and responsibilities in written agreements to ensure compliance with data integrity requirements. It emphasizes the need for periodic site audits to verify compliance, particularly when data and document retention are outsourced to a third party. The guidelines also highlight the training requirements for personnel interacting with GxP data, emphasizing the importance of understanding data integrity principles, good documentation practices, and training on using computerized systems for data generation and processing. Overall, the section focuses on ensuring data integrity and compliance in outsourcing and training operations according to the WHO TR 1033 Annex 4 guidelines.", "excerpt_keywords": "WHO, data integrity, pharmaceutical preparations, GxP computerized systems, data transfer"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf\n## who expert committee on specifications for pharmaceutical preparations fifty-fifth report\n\nwhere risk assessment has shown that this is required to relevant personnel. 
such training should include validation of computerized systems and, for example, system security assessment, back-up, restoration, disaster recovery, change and configuration management, and reviewing of electronic data and metadata, such as audit trails and logs, for each gxp computerized system used in the generation, processing and reporting of data.\n\n### 9. data, data transfer and data processing\n\n9.1. data may be recorded on paper or captured electronically by using equipment and instruments including those linked to computerised systems. a combination of paper and electronic formats may also be used, referred to as a \"hybrid system\".\n\n9.2. data integrity considerations are also applicable to media such as photographs, videos, dvds, imagery and thin layer chromatography plates. there should be a documented rationale for the selection of such a method.\n\n9.3. risk-reducing measures such as scribes, second person oversight, verification and checks should be implemented where there is difficulty in accurately and contemporaneously recording data related to critical process parameters or critical quality attributes.\n\n9.4. results and data sets require independent verification if deemed necessary from the dira or by another requirement.\n\n9.5. programmes and methods (such as processing methods in sample analysis; see also good chromatography practices, trs 1025) should ensure that data meet alcoa+ principles. where results or data are processed using a different method/parameters, then each version of the processing method should be recorded. data records and content versions, together with audit trails containing the required details, should allow for reconstruction of all data processing in gxp computerized systems over the data life cycle.\n\n9.6. data transfer/migration procedures should include a rationale and be robustly designed and validated to ensure that data integrity is maintained during the data life cycle. 
careful consideration should be given to understanding the data format and the potential for alteration at each stage of data generation, transfer and subsequent storage. the challenges of migrating data are often underestimated, particularly regarding maintaining the full meaning of the migrated records.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "071a396b-4d24-4dd8-abf7-d7a4b448fc34": {"__data__": {"id_": "071a396b-4d24-4dd8-abf7-d7a4b448fc34", "embedding": null, "metadata": {"page_label": "13", "file_name": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_type": "application/pdf", "file_size": 307865, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Data Transfer Validation and Good Documentation Practices in Paper Records: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. What specific measures are recommended by the WHO TR 1033 Annex 4 Guideline to ensure the integrity of data transferred to worksheets or other applications, and how should changes in middle layer software be managed to maintain data integrity?\n\n2. According to the WHO TR 1033 Annex 4 Guideline, what are the specific good documentation practices that should be implemented to comply with ALCOA+ principles, especially in relation to the physical characteristics of paper and ink used for recording data?\n\n3. 
How does the WHO TR 1033 Annex 4 Guideline propose to control and maintain the integrity of raw data and results recorded on paper records, including the management of changes and archival processes?", "prev_section_summary": "The section discusses the importance of training personnel involved in managing GxP computerized systems to ensure data integrity in pharmaceutical preparations. It also addresses data integrity considerations for various formats such as photographs, videos, and thin layer chromatography plates. The section emphasizes the need for risk-reducing measures, independent verification of results, and ensuring data meet alcoa+ principles. Additionally, it highlights the importance of robustly designed and validated data transfer/migration procedures to maintain data integrity throughout the data life cycle.", "excerpt_keywords": "WHO TR 1033, data integrity, data transfer validation, good documentation practices, paper records"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf\n## annex 4\n\ndata transfer should be validated. the data should not be altered during or after it is transferred to the worksheet or other application. there should be an audit trail for this process. the appropriate quality procedures should be followed if the data transfer during the operation has not occurred correctly. any changes in the middle layer software should be managed through the appropriate quality management systems (7).\n\n## good documentation practices\n\nnote: the principles contained in this section are applicable to paper data.\n\n|10.1.|good documentation practices should be implemented and enforced to ensure compliance with alcoa+ principles.|\n|---|---|\n|10.2.|data and recorded media should be durable. ink should be indelible. temperature-sensitive or photosensitive inks and other erasable inks should not be used. 
where related risks are identified, means should be identified in order to ensure traceability of the data over their life cycle.|\n|10.3.|paper should not be temperature-sensitive, photosensitive or easily oxidizable. if this is not feasible or limited, then true or certified copies should be generated.|\n|10.4.|specific controls should be implemented in order to ensure the integrity of raw data and results recorded on paper records. these may include, but are not limited to: - control over the issuance and use of loose paper sheets at the time of recording data;\n- no use of pencils or erasers;\n- use of single-line cross-outs to record changes with the identifiable person who made the change, date and reason for the change recorded (i.e. the paper equivalent to an electronic audit trail);\n- no use of correction fluid or any other means of obscuring the original record;\n- controlled issuance of bound, paginated notebooks;\n- controlled issuance and reconciliation of sequentially numbered copies of blank forms with authenticity controls;\n- maintaining a signature and initial record for traceability and defining the levels of signature of a record; and\n- archival of records by designated personnel in secure and controlled archives.\n|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "4d372e8d-741b-43f4-a33b-2a76c8512a5e": {"__data__": {"id_": "4d372e8d-741b-43f4-a33b-2a76c8512a5e", "embedding": null, "metadata": {"page_label": "14", "file_name": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_type": "application/pdf", "file_size": 307865, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title":
"Guidelines for the Use of Computerized Systems in Pharmaceutical Preparations: Ensuring Accuracy, Efficiency, and Compliance", "questions_this_excerpt_can_answer": "1. What specific guidance does the WHO TR 1033 Annex 4 Guideline provide regarding the validation and maintenance of computerized systems in pharmaceutical preparations?\n \n2. How does the document address the management of data integrity risks throughout the data lifecycle in the context of Good Manufacturing Practice (GMP) systems?\n\n3. What recommendations does the guideline make for ensuring data integrity when using electronic instruments or systems without configurable software and electronic data retention in pharmaceutical settings?", "prev_section_summary": "The section discusses data transfer validation and good documentation practices in paper records according to the WHO TR 1033 Annex 4 Guideline. Key topics include ensuring data integrity during transfer, managing changes in middle layer software, implementing good documentation practices to comply with ALCOA+ principles, using durable and indelible ink, controlling issuance and use of paper sheets, avoiding erasable inks and correction fluid, maintaining traceability of data, and archival processes. Key entities mentioned include audit trails, quality procedures, middle layer software, ink, paper, raw data, results, loose paper sheets, pencils, erasers, notebooks, blank forms, signatures, archival personnel, and secure archives.", "excerpt_keywords": "WHO, computerized systems, data integrity, pharmaceutical preparations, validation"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf\n## who expert committee on specifications for pharmaceutical preparations fifty-fifth report\n\n11. computerized systems\n(note. this section highlights some specific aspects relating to the use of computerized systems.
it is not intended to repeat the information presented in the other who guidelines here, such as the who guideline on computerized systems (3), who guideline on validation (2) and who guideline on good chromatography practices (7). see references.)\n11.1. each computerized system selected should be suitable, validated for its intended use, and maintained in a validated state.\n11.2. where gxp systems are used to acquire, record, transfer, store or process data, management should have appropriate knowledge of the risks that the system and users may pose to the integrity of the data.\n11.3. software of computerized systems, used with gxp instruments and equipment, should be appropriately configured (where required) and validated. the validation should address for example the design, implementation and maintenance of controls in order to ensure the integrity of manually and automatically acquired data; ensure that good documentation practices will be implemented; and that data integrity risks will be appropriately managed throughout the data life cycle. the potential for unauthorized and adverse manipulation of data during the life cycle of the data should be mitigated and, where possible, eliminated.\n11.4. where electronic instruments (e.g. certain ph meters, balances and thermometers) or systems with no configurable software and no electronic data retention are used, controls should be put in place to prevent the adverse manipulation of data and to prevent repeat testing to achieve the desired result.\n11.5. appropriate controls for the detection of lapses in data integrity principles should be in place. technical controls should be used whenever possible but additional procedural or administrative controls should be implemented to manage aspects of computerised system control where technical controls are missing.
for example, when stand-alone computerized systems with a user-configurable output are used, fourier-transform infrared spectroscopy (ftir) and uv spectrophotometers have user-configurable output or reports that cannot be controlled using technical controls. other examples of non-technical detection and prevention mechanisms may include, but are not limited to, instrument usage logbooks and electronic audit trails.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "a27dca37-0d94-423f-a50e-1c22fe7ce184": {"__data__": {"id_": "a27dca37-0d94-423f-a50e-1c22fe7ce184", "embedding": null, "metadata": {"page_label": "15", "file_name": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_type": "application/pdf", "file_size": 307865, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Access Control and Audit Trail Requirements for GxP Systems: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. What specific measures does the WHO TR 1033 Annex 4 guideline recommend for managing user access and privileges in systems handling GxP data to ensure data integrity?\n \n2. How does the guideline address the issue of system administrators' privileges and the oversight of their activities to prevent conflicts of interest and ensure the integrity of GxP data?\n\n3.
What are the guideline's recommendations regarding the use of shared logins or generic user access for systems generating, amending, or storing GxP data, and what alternatives does it suggest if a computerized system does not support individual user access?", "prev_section_summary": "The section discusses the use of computerized systems in pharmaceutical preparations, emphasizing the importance of selecting suitable and validated systems that are maintained in a validated state. It highlights the need for management to have appropriate knowledge of the risks to data integrity posed by the system and users. The document also addresses the validation and configuration of software in computerized systems, as well as the prevention of unauthorized data manipulation. Recommendations are provided for ensuring data integrity when using electronic instruments or systems without configurable software. Additionally, the section emphasizes the importance of implementing controls for detecting lapses in data integrity and suggests using technical, procedural, or administrative controls as needed.", "excerpt_keywords": "WHO TR 1033, Annex 4, data integrity, access control, audit trail"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf\n## annex 4\n\naccess and privileges\n\n11.6. there should be a documented system in place that defines the access and privileges of users of systems. there should be no discrepancy between paper records and electronic records where paper systems are used to request changes for the creation and inactivation of users. inactivated users should be retained in the system. a list of active and inactivated users should be maintained throughout the system life cycle.\n\n11.7. access and privileges should be in accordance with the role and responsibility of the individual with the appropriate controls to ensure data integrity (e.g. 
no modification, deletion or creation of data outside the defined privilege and in accordance with the authorized procedures defining review and approval where appropriate).\n\n11.8. a limited number of personnel, with no conflict of interest in data, should be appointed as system administrators. certain privileges such as data deletion, database amendment or system configuration changes should not be assigned to administrators without justification - and such activities should only be done with documented evidence of authorization by another responsible person. records should be maintained and audit trails should be enabled in order to track activities of system administrators. as a minimum, activity logging for such accounts and the review of logs by designated roles should be conducted in order to ensure appropriate oversight.\n\n11.9. for systems generating, amending or storing gxp data, shared logins or generic user access should not be used. the computerised system design should support individual user access. where a computerised system supports only a single user login or limited numbers of user logins and no suitable alternative computerised system is available, equivalent control should be provided by third-party software or a paper-based method that provides traceability (with version control). the suitability of alternative systems should be justified and documented (8). the use of legacy hybrid systems should be discouraged and a priority timeline for replacement should be established.\n\naudit trail\n\n11.10. gxp systems should provide for the retention of audit trails. 
audit trails should reflect, for example, users, dates, times, original data and results, changes and reasons for changes (when required to be recorded), and enabling and disabling of audit trails.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "ccf6e729-0494-4b92-b3bd-d59732341a8b": {"__data__": {"id_": "ccf6e729-0494-4b92-b3bd-d59732341a8b", "embedding": null, "metadata": {"page_label": "16", "file_name": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_type": "application/pdf", "file_size": 307865, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Ensuring Data Integrity and Security in Pharmaceutical Preparations: Best Practices and Measures", "questions_this_excerpt_can_answer": "1. What specific measures does the WHO TR 1033 Annex 4 Guideline recommend for ensuring audit trails in GxP relevant software systems remain enabled and verifiable throughout the data life cycle?\n\n2. How does the guideline address the challenge of maintaining data integrity in legacy systems that cannot support ALCOA+ principles due to design limitations, and what are the recommended temporary mitigation measures?\n\n3. What criteria does the guideline set for the control, creation, and retention of electronic signatures in pharmaceutical preparations to ensure they are attributable, unalterable, and securely linked to their respective records?", "prev_section_summary": "The section discusses access control and audit trail requirements for GxP systems, as outlined in the WHO TR 1033 Annex 4 guideline.
Key topics include defining user access and privileges, maintaining consistency between paper and electronic records, assigning appropriate access based on roles and responsibilities, appointing system administrators with limited privileges and no conflicts of interest, prohibiting shared logins or generic user access for systems handling GxP data, and ensuring audit trails are retained to track system activities. Key entities mentioned include users, system administrators, data integrity controls, audit trails, and GxP data.", "excerpt_keywords": "WHO, data integrity, pharmaceutical preparations, audit trails, electronic signatures"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf\n## who expert committee on specifications for pharmaceutical preparations fifty-fifth report\n\n|11.11.|all gxp relevant audit trails should be enabled when software is installed and remain enabled at all times. there should be evidence of enabling the audit trail. there should be periodic verification to ensure that the audit trail remains enabled throughout the data life cycle.|\n|---|---|\n|11.12.|where a system cannot support alcoa+ principles by design (e.g. legacy systems with no audit trail), mitigation measures should be taken for defined temporary periods. for example, add-on software or paper-based controls may be used. the suitability of alternative systems should be justified and documented. this should be addressed within defined timelines.|\n\n### electronic signatures\n\n11.13. each electronic signature should be appropriately controlled by, for example, senior management. an electronic signature should be:\n\n- attributable to an individual;\n- free from alteration and manipulation;\n- permanently linked to its respective record; and\n- date- and time-stamped.\n\n11.14.
an inserted image of a signature or a footnote indicating that the document has been electronically signed is not adequate unless it was created as part of the validated electronic signature process. the metadata associated with the signature should be retained.\n\n### data backup, retention and restoration\n\n11.15. data should be retained (archived) in accordance with written policies and procedures, and in such a manner that they are protected, enduring, readily retrievable and remain readable throughout the records retention period. true copies of original records may be retained in place of the original record, where justified. electronic data should be backed up according to written procedures.\n\n11.16. data and records, including backup data, should be kept under conditions which provide appropriate protection from deterioration. access to such storage areas should be controlled and should be accessible only by authorized personnel.\n\n11.17. data retention periods should be defined in authorized procedures.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "40204e42-2368-4d32-be84-98b2d7634869": {"__data__": {"id_": "40204e42-2368-4d32-be84-98b2d7634869", "embedding": null, "metadata": {"page_label": "17", "file_name": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_type": "application/pdf", "file_size": 307865, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Data Management and Integrity Procedures in GXP Compliance: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. 
What specific procedures should be followed for the destruction of data and records in compliance with GXP guidelines, and how should records of such destruction be maintained according to the WHO TR 1033 Annex 4 guideline?\n\n2. How does the WHO TR 1033 Annex 4 guideline recommend validating backup and restoration processes for GXP data, including the frequency and verification measures for ensuring data and metadata integrity?\n\n3. What are the documented requirements for the review and approval of GXP data and metadata, including the consideration of audit trails, as outlined in the WHO TR 1033 Annex 4 guideline?", "prev_section_summary": "The key topics of this section include ensuring audit trails in GxP relevant software systems remain enabled and verifiable, addressing data integrity in legacy systems that cannot support ALCOA+ principles, criteria for electronic signatures in pharmaceutical preparations, data backup, retention, and restoration measures. The entities involved in these topics are WHO expert committee on specifications for pharmaceutical preparations, senior management, individuals responsible for electronic signatures, and authorized personnel responsible for data retention and storage.", "excerpt_keywords": "Data integrity, GXP compliance, WHO TR 1033, Data management, Audit trails"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf\n## annex 4\n\n11.18. the decision for and manner in which data and records are destroyed should be described in written procedures. records for the destruction should be maintained.\n\n11.19. backup and restoration processes should be validated. backups should be made routinely, and periodically restored and verified for completeness and accuracy of data and metadata. where any discrepancies are identified, they should be investigated and appropriate action taken.\n\n12.
data review and approval\n\n12.2. there should be a documented procedure for the routine and periodic review, as well as the approval of data. personnel with appropriate knowledge and experience should be responsible for reviewing and checking data. they should have access to original electronic data and metadata.\n\n12.3. the routine review of gxp data and metadata should include audit trails. factors such as criticality of the system (high impact versus low impact) and category of audit trail information (e.g. batch specific, administrative, system activities, and so on) should be considered when determining the frequency of the audit trail review.\n\n12.4. a procedure should describe the actions to be taken where errors, discrepancies or omissions are identified in order to ensure that the appropriate corrective and preventive actions are taken.\n\n12.5. evidence of the review should be maintained.\n\n12.6. a conclusion, where required, following the review of original data, metadata and audit trail records should be documented, signed and dated.\n\n13. corrective and preventive actions\n\n13.1. where organizations use computerized systems (e.g. for gxp data acquisition, processing, interpretation, reporting) which do not meet current gxp requirements, an action plan towards upgrading such systems should be documented and implemented in order to ensure compliance with current gxp.\n\n13.2.
when lapses in gxp relevant data regarding data integrity are identified, a risk-based approach may be used to determine the scope of the", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "9721d091-235e-4e6a-b485-8a5197d2229d": {"__data__": {"id_": "9721d091-235e-4e6a-b485-8a5197d2229d", "embedding": null, "metadata": {"page_label": "18", "file_name": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_type": "application/pdf", "file_size": 307865, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Guidelines on Pharmaceutical Preparations and Good Manufacturing Practices: Ensuring Data Integrity and Quality Risk Management by WHO Expert Committee", "questions_this_excerpt_can_answer": "1. What specific WHO technical report series numbers are associated with guidelines on good manufacturing practices for pharmaceutical products and their validation, as mentioned in the WHO Expert Committee on Specifications for Pharmaceutical Preparations reports from 2013 and 2019?\n \n2. How does the WHO Expert Committee on Specifications for Pharmaceutical Preparations address the issue of data integrity in the context of pharmaceutical preparations and good manufacturing practices, as outlined in their fifty-fifth report?\n\n3. 
What are the key references and further reading materials recommended by the WHO Expert Committee for understanding data integrity, quality risk management, and good manufacturing practices in the pharmaceutical industry, as detailed in their fifty-fifth report?", "prev_section_summary": "The section discusses procedures for data destruction, validation of backup and restoration processes, review and approval of GXP data and metadata, and corrective and preventive actions for ensuring data integrity in compliance with GXP guidelines. Key topics include the description of data destruction procedures, validation of backup processes, routine review and approval of data, consideration of audit trails, and actions to be taken in case of errors or discrepancies. Key entities mentioned are personnel responsible for data review, original electronic data and metadata, audit trails, and corrective and preventive action plans for upgrading systems to meet current GXP requirements.", "excerpt_keywords": "WHO, Expert Committee, Specifications, Pharmaceutical Preparations, Data Integrity"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf\n## who expert committee on specifications for pharmaceutical preparations fifty-fifth report\n\ninvestigation, root cause, impact and capa, as appropriate. health authorities, contract givers and other relevant organizations should be notified if the investigation identifies a significant impact or risk to, for example, materials, products, patients, reported information or data in application dossiers, and clinical trials.\n\n### references\n\n|1.|guidelines on good manufacturing practices for pharmaceutical products: main principle.|in: who expert committee on specifications for pharmaceutical preparations: forty-eighth report. geneva: world health organization; 2013: annex 2 (who technical report series, no. 
986; link, accessed 4 may 2020).|\n|---|---|---|\n|2.|good manufacturing practices: guidelines on validation.|in: who expert committee on specifications for pharmaceutical preparations; fifty-third report. geneva: world health organization; 2019: annex 3 (who technical report series, no. 1019; link, accessed 5 may 2020).|\n|3.|good manufacturing practices: guidelines on validation. appendix 5. validation of computerized systems.|in: who expert committee on specifications for pharmaceutical preparations: fifty-third report. geneva: world health organization; 2019: annex 3 (who technical report series, no. 1019; link, accessed 4 may 2020).|\n|4.|guidelines on quality risk management.|in: who expert committee on specifications for pharmaceutical preparations: forty-seventh report. geneva: world health organization; 2013: annex 2 (who technical report series, no. 981; link, accessed 4 may 2020).|\n|5.|ich harmonised tripartite guideline. quality risk management q9.|geneva: international conference on harmonisation of technical requirements for registration of pharmaceuticals for human use; 2005 (link, accessed 12 june 2020).|\n|6.|good chromatography practices.|in: who expert committee on specifications for pharmaceutical preparations: fifty-fourth report. geneva: world health organization; 2020: annex 4 (who technical report series, no. 1025; link, accessed 12 june 2020).|\n|7.|mhra gxp data integrity guidance and definitions; revision 1: medicines & healthcare products regulatory agency (mhra), london, march 2018|(link, accessed 12 june 2020).|\n\n### further reading\n\ndata integrity and compliance with cgmp: questions and answers. guidance for industry. u.s. department of health and human services, food and drug administration; 2016 (link, accessed 15 june 2020).", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "9809410f-9132-4ac7-bc52-ec813fe0926f": {"__data__": {"id_": "9809410f-9132-4ac7-bc52-ec813fe0926f", "embedding": null, "metadata": {"page_label": "19", "file_name": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_type": "application/pdf", "file_size": 307865, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Best Practices for Data Management and Integrity in Regulated Pharmaceutical Environments", "questions_this_excerpt_can_answer": "1. What are the key resources or guidelines mentioned for ensuring data management and integrity in regulated GMP/GDP environments as of November 2018?\n \n2. Can you provide the publication dates and sources for the guidelines and technical reports that support best practices in pharmaceutical data integrity and risk-based manufacturing as referenced in the document?\n\n3. What specific international guideline is mentioned for the pharmaceutical quality system, including its publication year and the URL where it can be accessed as of October 2020?", "prev_section_summary": "The section discusses the guidelines on data integrity and quality risk management in pharmaceutical preparations and good manufacturing practices as outlined by the WHO Expert Committee on Specifications for Pharmaceutical Preparations in their fifty-fifth report. It covers the importance of investigation, root cause analysis, impact assessment, and corrective and preventive actions in case of data integrity issues.
The section also provides references to key WHO technical report series numbers related to good manufacturing practices and validation, as well as further reading materials on data integrity and compliance with CGMP guidance. Key entities mentioned include health authorities, contract givers, and relevant organizations involved in ensuring data integrity and quality in the pharmaceutical industry.", "excerpt_keywords": "data management, data integrity, GMP, GDP, pharmaceutical industry"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf\ngood practices for data management and integrity in regulated gmp/gdp environments.\npharmaceutical inspection convention and pharmaceutical inspection co-operation scheme\n(pic/s), november 2018 (https://picscheme.org/layout/document.php?id=1567, accessed 15\njune 2020).\nbaseline guide vol 7: risk-based manufacture of pharma products; 2nd edition.\nispe baseline (r) guide, july 2017. ispe gamp (r) guide: records and data integrity; march 2017.\ndata integrity management system for pharmaceutical laboratories pda technical report, no. 80;\naugust 2018.\nich harmonised tripartite guideline. pharmaceutical quality system q10.
geneva: international\nconference on harmonisation of technical requirements for registration of pharmaceuticals for\nhuman use; 2008 (https://database.ich.org/sites/default/files/q10%20guideline.pdf, accessed\n2 october 2020).", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "2380fc44-48cd-4034-aebf-74aa7961149e": {"__data__": {"id_": "2380fc44-48cd-4034-aebf-74aa7961149e", "embedding": null, "metadata": {"page_label": "20", "file_name": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_type": "application/pdf", "file_size": 307865, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Data Integrity Management in Pharmaceutical Preparations: Ensuring Accuracy and Security in Drug Manufacturing and Distribution", "questions_this_excerpt_can_answer": "1. How does the WHO TR 1033 Annex 4 Guideline on Data Integrity recommend assessing and managing risks related to data integrity in pharmaceutical manufacturing, specifically using the example of a failure mode and effects analysis (FMEA)?\n \n2. In the context of data integrity management within pharmaceutical preparations, what specific example does the guideline provide to illustrate a low-risk data integrity lapse involving the recording of dates during sample weighing, and how should such a situation be addressed according to the guideline?\n\n3. 
What principles does the WHO TR 1033 Annex 4 Guideline emphasize regarding good documentation practices to prevent data integrity issues in the pharmaceutical industry, and how does it suggest these practices can help in eliminating erroneous entries, manipulation, and human error?", "prev_section_summary": "The section discusses key resources and guidelines for ensuring data management and integrity in regulated GMP/GDP environments as of November 2018. It mentions the Pharmaceutical Inspection Convention and Pharmaceutical Inspection Co-operation Scheme (PIC/S), the ISPE Baseline Guide, the ISPE GAMP Guide, the PDA Technical Report, and the ICH Harmonised Tripartite Guideline on Pharmaceutical Quality System Q10. These resources provide best practices for data management and integrity in pharmaceutical manufacturing, with a focus on risk-based approaches and quality systems.", "excerpt_keywords": "Data Integrity, Pharmaceutical Preparations, WHO TR 1033, Risk Management, Good Documentation Practices"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf\n## who expert committee on specifications for pharmaceutical preparations fifty-fifth report\n\n### appendix 1\n\n### examples in data integrity management\n\nthis appendix reflects on some examples in data integrity management in order to support the main text on data integrity. it should be noted that these are examples and are intended for the purpose of clarification only.\n\n#### example 1: quality risk management and data integrity risk assessment\n\nrisk management is an important part of good practices (gxp). 
risks should be identified and assessed and controls identified and implemented in order to assist manufacturers in preventing possible data integrity (di) lapses.\n\nas an example, a failure mode and effects analysis (fmea) model (or any other tool) can be used to identify and assess the risks relating to any system where data are, for example, acquired, processed, recorded, saved and archived. the risk assessment can be done as a prospective or retrospective exercise. corrective and preventive action (capa) should be identified, implemented and assessed for its effectiveness.\n\nfor example, during the weighing of a sample, the date may not be contemporaneously recorded on the worksheet, but the date is available on the print-out from the weighing balance and in the log book for the balance for that particular activity. the fact that the date was not recorded on the worksheet may be considered a lapse in data integrity expectations. when assessing the risk relating to the lack of the date in the data, the risk may be considered different (lower) in this case as opposed to a situation where there is no other means of traceability for the activity (e.g. no print-out from the balance). when assessing the risk relating to the lapse in data integrity, the severity could be classified as \"low\" (the data is available on the print-out); it does not happen on a regular basis (occurrence is \"low\"), and it could easily be detected by the reviewer (detection is \"high\") - therefore the overall risk factor may be considered low. the root cause as to why the record was not made in the analytical report at the time of weighing should still be identified and the appropriate action taken to prevent this from happening again.\n\n#### example 2: good documentation practices in data integrity\n\ndocumentation should be managed with care. 
these should be appropriately designed in order to assist in eliminating erroneous entries, manipulation and human error.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "3ffdb664-f1f8-4e2f-a25f-0a30de5cd50d": {"__data__": {"id_": "3ffdb664-f1f8-4e2f-a25f-0a30de5cd50d", "embedding": null, "metadata": {"page_label": "21", "file_name": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_type": "application/pdf", "file_size": 307865, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Data Entry and Record Keeping Best Practices for Candidate Titles and Content", "questions_this_excerpt_can_answer": "1. What specific design considerations does the WHO TR 1033 Annex 4 Guideline recommend for formats used in recording or entering data to ensure the integrity of information, especially in a computerized system?\n\n2. According to the WHO TR 1033 Annex 4 Guideline, what are the recommended controls and procedures for using blank sheets of paper in documentation processes to prevent unauthorized use and ensure traceability?\n\n3. How does the WHO TR 1033 Annex 4 Guideline suggest handling errors in data recording to maintain the integrity of the original entry while ensuring the correction is properly documented and traceable?", "prev_section_summary": "The section discusses examples of data integrity management in pharmaceutical preparations, focusing on risk assessment using tools like Failure Mode and Effects Analysis (FMEA) and emphasizing the importance of good documentation practices. 
It provides an example of a low-risk data integrity lapse involving the recording of dates during sample weighing and suggests corrective actions to address such situations. The section also highlights the need for identifying and assessing risks related to data integrity, implementing controls, and emphasizing the role of documentation in preventing errors and manipulation in the pharmaceutical industry.", "excerpt_keywords": "WHO TR 1033, data integrity, record keeping, documentation practices, pharmaceutical industry"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf\n## annex 4\n\nformats\n\ndesign formats to enable personnel to record or enter the correct information contemporaneously. provision should be made for entries such as, but not limited to, dates, times (start and finish time, where appropriate), signatures, initials, results, batch numbers and equipment identification numbers. when a computerized system is used, the system should prompt the personnel to make the entries at the appropriate step.\n\nblank sheets of paper\n\nthe use of blank sheets should not be encouraged. where blank sheets are used (e.g. to supplement worksheets, laboratory notebooks and master production and control records), the appropriate controls have to be in place and may include, for example, a numbered set of blank sheets issued which are reconciled upon completion. similarly, bound paginated notebooks, stamped or formally issued by designated personnel, allow for the detection of unofficial notebooks and any gaps in notebook pages. authorization may include two or three signatures with dates, for example, \"prepared by\" or \"entered by\", \"reviewed by\" and \"approved by\".\n\nerror in recording data\n\ncare should be taken when entries of data and results (electronic and paper records) are made. 
entries should be made in compliance with good documentation practices. where incorrect information has been recorded, it may be corrected provided that the reason for the error is documented, the original entry remains readable and the correction is signed and dated.\n\nexample 3: data entry\n\ndata entry includes, for example, sample receiving registration, sample analysis result recording, logbook entries, registers, batch manufacturing record entries and information in case report forms. the recording of source data on paper records should be done using indelible ink, in a way that is complete, accurate, traceable, attributable and free from errors. direct entry into electronic records should be done by responsible and appropriately trained individuals. entries should be traceable to an individual (in electronic records, thus having an individual user access) and traceable to the date (and time, where relevant). where appropriate, the entry should be verified by a second person or entered through technical means such as the scanning of bar-codes, where possible, for the intended use of these data. 
additional controls may include the locking of critical data entries after the data are verified and a review of audit trails for", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "4935ef5b-5813-4b20-bf09-5fa0da6cb12c": {"__data__": {"id_": "4935ef5b-5813-4b20-bf09-5fa0da6cb12c", "embedding": null, "metadata": {"page_label": "22", "file_name": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_type": "application/pdf", "file_size": 307865, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Ensuring Data Integrity and Long-Term Accessibility in Pharmaceutical Preparations: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. What specific guidelines does the WHO TR 1033 Annex 4 provide for handling out-of-specification or atypical results in pharmaceutical data sets?\n \n2. How does the WHO TR 1033 Annex 4 Guideline recommend ensuring the long-term readability and accessibility of electronic pharmaceutical data and metadata, including considerations for software compatibility and media deterioration?\n\n3. According to the WHO TR 1033 Annex 4 Guideline, what procedures should be followed for the archival and retrieval of electronic data in the pharmaceutical industry to maintain data integrity and ensure that metadata remains securely traceable to the relevant data set?", "prev_section_summary": "The section discusses the importance of designing formats for data entry to ensure accuracy and integrity, including the inclusion of key information such as dates, times, signatures, and batch numbers. 
It also emphasizes the need for controls when using blank sheets of paper, such as issuing numbered sets and obtaining multiple signatures for authorization. Additionally, the section addresses the handling of errors in data recording, emphasizing the importance of documenting the reason for the error, ensuring the original entry remains readable, and signing and dating any corrections. The excerpt also provides examples of data entry processes and the importance of using indelible ink for paper records and traceable electronic entry methods.", "excerpt_keywords": "WHO, TR 1033, Annex 4, data integrity, pharmaceutical preparations"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf\n## who expert committee on specifications for pharmaceutical preparations fifty-fifth report\n\ncritical data to detect if they have been altered. the manual entry of data from a paper record into a computerized system should be traceable to the paper records used which are kept as original data.\n\nexample 4: dataset\n\nall data should be included in the dataset unless there is a documented, justifiable, scientific explanation and procedure for the exclusion of any result or data. whenever out of specification or out of trend or atypical results are obtained, they should be investigated in accordance with written procedures. this includes investigating and determining capa for invalid runs, failures, repeats and other atypical data. the review of original electronic data should include checks of all locations where data may have been stored, including locations where voided, deleted, invalid or rejected data may have been stored. data and metadata related to a particular test or product should be recorded together. the data should be appropriately stored in designated folders. the data should not be stored in other electronic folders or in other operating system logs. 
electronic data should be archived in accordance with a standard operating procedure. it is important to ensure that associated metadata are archived with the relevant data set or securely traceable to the data set through relevant documentation. it should be possible to successfully retrieve all required data and metadata from the archives. the retrieval and verification should be done at defined intervals and in accordance with an authorized procedure.\n\nexample 5: legible and enduring\n\ndata and metadata should be readable during the life cycle of the data. electronic data are normally only legible/readable through the original software application that created them. in addition, there may be restrictions around the version of a software application that can read the data. when storing data electronically, ensure that any restrictions which may apply and the ability to read the electronic data are understood. clarification from software vendors should be sought before performing any upgrade, or when switching to an alternative application, to ensure that data previously created will be readable.\n\nother risks include the fading of microfilm records, the decreasing readability of the coatings of optical media such as compact disks (cds) and digital versatile/video disks (dvds), and the fact that these media may become brittle. similarly, historical data stored on magnetic media will also become unreadable over time as a result of deterioration. data and records should be stored in an appropriate manner, under the appropriate conditions.\n\nwho technical report series, no. 
1033, 2021", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "05b644bf-3603-4ba2-9c82-912b1d8fe914": {"__data__": {"id_": "05b644bf-3603-4ba2-9c82-912b1d8fe914", "embedding": null, "metadata": {"page_label": "23", "file_name": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_type": "application/pdf", "file_size": 307865, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Data Traceability and Contemporaneous Recordkeeping: A Guide to Ensuring Compliance and Accuracy", "questions_this_excerpt_can_answer": "1. What specific methods are recommended by the WHO TR 1033 Annex 4 Guideline for ensuring data is attributable to an individual or measurement system in both paper and electronic records?\n \n2. How does the guideline address the recording of data and information to ensure it is contemporaneous, and what are the recommendations for using hybrid systems or a scribe in data recording processes?\n\n3. According to the WHO TR 1033 Annex 4 Guideline, how should the process for supervisory (scribe) documentation completion be managed, and what does it say about the traceability of employees involved in data recording and their actions?", "prev_section_summary": "The section discusses guidelines provided by the WHO TR 1033 Annex 4 for ensuring data integrity and long-term accessibility in pharmaceutical preparations. 
Key topics include handling out-of-specification or atypical results in data sets, archival and retrieval procedures for electronic data, ensuring the readability and accessibility of electronic data and metadata, and considerations for software compatibility and media deterioration. Entities mentioned include the WHO Expert Committee on Specifications for Pharmaceutical Preparations, data sets, metadata, electronic data, software applications, and storage media such as microfilm, optical disks, and magnetic media.", "excerpt_keywords": "WHO TR 1033, data integrity, traceability, contemporaneous recordkeeping, electronic signatures"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf\n## annex 4\n\nexample 6: attributable\n\ndata should be attributable, thus being traceable to an individual and where relevant, the measurement system. in paper records, this could be done through the use of initials, full handwritten signature or a controlled personal seal. in electronic records, this could be done through the use of unique user logons that link the user to actions that create, modify or delete data; or unique electronic signatures which can be either biometric or non-biometric. an audit trail should capture user identification (id), date and time stamps and the electronic signature should be securely and permanently linked to the signed record.\n\nexample 7: contemporaneous\n\npersonnel should record data and information at the time these are generated and acquired. for example, when a sample is weighed or prepared, the weight of the sample (date, time, name of the person, balance identification number) should be recorded at that time and not before or at a later stage. in the case of electronic data, these should be automatically date- and time-stamped. 
in case hybrid systems are to be used, including the use for an interim period, the potential and criticality of system breaches should be covered in the assessment with documented mitigating controls in place. (the replacement of hybrid systems should be a priority with a documented capa plan.) the use of a scribe to record an activity on behalf of another operator should be considered only on an exceptional basis and should only take place where, for example, the act of recording places the product or activity at risk, such as, documenting line interventions by aseptic area operators. it needs to be clearly documented when a scribe has been applied.\n\n\"in these situations, the recording by the second person should be contemporaneous with the task being performed, and the records should identify both the person performing the task and the person completing the record. the person performing the task should countersign the record wherever possible, although it is accepted that this countersigning step will be retrospective. 
the process for supervisory (scribe) documentation completion should be described in an approved procedure that specifies the activities to which the process applies.\" (extract taken from the medicines & healthcare products regulatory agency (mhra) gxp data integrity guidance and definitions (10).)\n\na record of employees indicating their name, signature, initials or other mark or seal used should be maintained to enable traceability and to uniquely identify them and the respective action.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "b6b66b2e-5c5d-4a50-b13b-3670965e5597": {"__data__": {"id_": "b6b66b2e-5c5d-4a50-b13b-3670965e5597", "embedding": null, "metadata": {"page_label": "24", "file_name": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_type": "application/pdf", "file_size": 307865, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "\"Ensuring Data Integrity and Traceability in GxP Activities: Best Practices and Guidelines\"", "questions_this_excerpt_can_answer": "1. What specific measures are recommended by the WHO TR 1033 Annex 4 Guideline to ensure traceability when changes are made to GxP data or results?\n \n2. According to the WHO TR 1033 Annex 4 Guideline, what are the examples of controls that should be implemented based on the outcome of a risk assessment to assure that all data meets GxP requirements and ALCOA+ principles?\n\n3. 
How does the WHO TR 1033 Annex 4 Guideline suggest handling the original capture of data in GxP activities to ensure full reconstruction of the conduct is possible, and what are the examples provided for the first source of data?", "prev_section_summary": "The section discusses the importance of data traceability and contemporaneous recordkeeping in ensuring compliance and accuracy in both paper and electronic records. It highlights the methods recommended by the WHO TR 1033 Annex 4 Guideline for attributing data to individuals and measurement systems, such as using initials, signatures, or electronic logins. The guideline also addresses the recording of data at the time of generation, the use of scribes in data recording processes, and the need for clear documentation and traceability of employees involved in data recording. Overall, the section emphasizes the importance of maintaining accurate and traceable records to ensure data integrity.", "excerpt_keywords": "Data integrity, Traceability, GxP activities, WHO TR 1033, ALCOA+ principles"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf\n## example 8: changes\n\nwhen changes are made to any gxp result or data, the change should be traceable to the person who made the change as well as the date, time and reason for the change. the original value should not be obscured. in electronic systems, this traceability should be documented via computer-generated audit trails or in other metadata fields or system features that meet these requirements. 
where an existing computerized system lacks computer-generated audit trails, personnel may use alternative means such as procedurally controlled use of log-books, change control, record version control or other combinations of paper and electronic records to meet gxp regulatory expectations for traceability to document the what, who, when and why of an action.\n\n## example 9: original\n\nthe first or source capture of data or information and all subsequent data required to fully reconstruct the conduct of the gxp activity should be available. in some cases, the electronic data (electronic chromatogram acquired through high-performance liquid chromatography (hplc)) may be the first source of data and, in other cases, the recording of the temperature on a log sheet in a room - by reading the value on a data logger. this data should be reviewed according to the criticality and risk assessment.\n\n## example 10: controls\n\nbased on the outcome of risk assessment which should cover all areas of data governance and data management, appropriate and effective controls should be identified and implemented in order to assure that all data, whether in paper records or electronic records, will meet gxp requirements and alcoa+ principles. 
examples of controls may include, but are not limited to:\n\n- the qualification, calibration and maintenance of equipment, such as balances and ph meters, that generate printouts;\n- the validation of computerized systems that acquire, process, generate, maintain, distribute, store or archive electronic records;\n- review and auditing of activities to ensure that these comply with applicable gxp data integrity requirements;\n- the validation of systems and their interfaces to ensure that the integrity of data will remain while transferring between/among computerized systems;\n- evaluation to ensure that computerized systems remain in a validated state;\n- the validation of analytical procedures;", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "c11a918f-a686-41d7-968b-83ab87513f56": {"__data__": {"id_": "c11a918f-a686-41d7-968b-83ab87513f56", "embedding": null, "metadata": {"page_label": "25", "file_name": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_type": "application/pdf", "file_size": 307865, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "\"Best Practices for Ensuring Accuracy and Control of GxP Records in Regulated Environments\"", "questions_this_excerpt_can_answer": "1. What specific measures are recommended by the WHO TR 1033 Annex 4 Guideline for ensuring the accuracy of critical data entries in GxP records, particularly regarding the entry of a master processing formula?\n \n2. 
How does the WHO TR 1033 Annex 4 Guideline suggest handling the validation and control of formulae for calculations and electronic data capture systems to maintain data integrity in regulated environments?\n\n3. According to the WHO TR 1033 Annex 4 Guideline, what are the recommended procedures for managing the migration of data between systems to ensure compliance with GxP records accuracy and control standards?", "prev_section_summary": "The section discusses the importance of ensuring data integrity and traceability in GxP activities, providing guidelines and best practices from the WHO TR 1033 Annex 4 Guideline. Key topics include the traceability of changes made to GxP data, the handling of original data capture, and the implementation of controls based on risk assessment outcomes. Entities mentioned include the need for documenting changes made to data, the availability of original data sources for reconstruction, and examples of controls such as equipment qualification, system validation, and review and auditing activities.", "excerpt_keywords": "WHO TR 1033, data integrity, GxP records, accuracy, control"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf\n## annex 4\n\npoints to consider for assuring accurate gxp records:\n\n- the entry of critical data into a computer by an authorized person (e.g. entry of a master processing formula) requires an additional check on the accuracy of the data entered manually. this check may be done by independent verification and release for use by a second authorized person or by validated electronic means. 
for example, to detect and manage risks associated with critical data, procedures would require verification by a second person;\n- validation and control over formulae for calculations including electronic data capture systems;\n- ensuring correct entries into the laboratory information management system (lims) such as fields for specification ranges;\n- other critical master data, as appropriate. once verified, these critical data fields should normally be locked in order to prevent further modification and only be modified through a formal change control process;\n- the process of data transfer between systems should be validated;\n- the migration of data including planned testing, control and validation; and\n- when the activity is time-critical, printed records should display the date and time stamp.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "431b4ce7-f2a8-4008-9943-51ee63013725": {"__data__": {"id_": "431b4ce7-f2a8-4008-9943-51ee63013725", "embedding": null, "metadata": {"page_label": "26", "file_name": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf", "file_type": "application/pdf", "file_size": 307865, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "\"Blank Canvas: A Collection of Unique Entities and Themes\"", "questions_this_excerpt_can_answer": "Based on the provided context, here are three questions that the context can provide specific answers to, which are unlikely to be found elsewhere:\n\n1. 
**What specific guidelines does the WHO TR 1033 Annex 4 provide regarding data integrity in the pharmaceutical development sector?**\n - This question targets the core content of the document mentioned in the context, which is likely to contain specialized guidelines on maintaining data integrity within pharmaceutical development. Given the specificity of the document's title and its relevance to a niche field, answers to this question would be uniquely detailed in the document.\n\n2. **How does the document \"Blank Canvas: A Collection of Unique Entities and Themes\" relate to the WHO TR 1033 Annex 4 Guideline on data integrity?**\n - This question seeks to understand the connection between the document's title, which suggests a broader thematic collection, and its specific focus on data integrity guidelines as per the WHO TR 1033 Annex 4. The answer would likely explore the thematic relevance or the inclusion of the guideline within a larger collection of works, which is a unique context provided by the excerpt.\n\n3. **What are the file characteristics and metadata details of the WHO TR 1033 Annex 4 Guideline on data integrity document as stored in the PharmaWise Engineer project?**\n - This question aims at extracting information about the document's digital attributes, such as file size, creation and modification dates, and storage location within a specific project directory. These details are uniquely available in the provided context and are unlikely to be found in other sources. 
This information could be particularly relevant for project management, document version control, and compliance tracking within the PharmaWise Engineer project.\n\nThese questions leverage the unique details provided in the context, focusing on content specificity, thematic relevance, and document metadata, which are areas directly addressed by the given excerpt.", "prev_section_summary": "The section discusses key measures recommended by the WHO TR 1033 Annex 4 Guideline for ensuring accuracy and control of GxP records in regulated environments. It highlights the importance of verifying critical data entries, controlling formulae for calculations and electronic data capture systems, managing data transfer between systems, and ensuring accurate entries into laboratory information management systems. The section emphasizes the need for validation, verification, and formal change control processes to maintain data integrity in GxP records.", "excerpt_keywords": "WHO TR 1033, Annex 4, data integrity, pharmaceutical development, guidelines"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[15] WHO TR 1033 Annex 4 Guideline on data integrity.pdf\nno_content_here", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "5728cbd7-2018-4b09-9076-7b6146bdc50d": {"__data__": {"id_": "5728cbd7-2018-4b09-9076-7b6146bdc50d", "embedding": null, "metadata": {"page_label": "1", "file_name": "[3] EU GMP guide annexes Supplementary requirements Annex 11 Computerised systems.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[3] EU GMP guide annexes Supplementary requirements Annex 11 Computerised systems.pdf", "file_type": "application/pdf", "file_size": 178746, "creation_date": "2024-04-07", 
"last_modified_date": "2024-03-28", "document_title": "Requirements and Considerations for Computerised Systems in GMP Compliance: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. What specific guidance does Annex 11 of the EU GMP guide provide regarding the implementation of controls for electronic document templates, particularly for spreadsheets, to ensure accuracy and reliability?\n \n2. How does Annex 11 of the EU GMP guide address the validation requirements for spreadsheets, especially those containing custom code or algorithms, within the context of GMP compliance for computerised systems?\n\n3. According to the EU GMP guide's Annex 11, what considerations are recommended for ensuring data security in databases throughout their lifecycle, including during validation and potential data migration phases?", "excerpt_keywords": "EU GMP guide, Annex 11, Computerised systems, Validation requirements, Data security, Risk management"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[3] EU GMP guide annexes Supplementary requirements Annex 11 Computerised systems.pdf\n## eu gmp guide annexes: supplementary requirements: annex 11: computerised systems\n\n1. appropriate controls for electronic documents such as templates should be implemented. are there any specific requirements for templates of spreadsheets? h+v february 2011\n\ntemplates of spreadsheets help to avoid erroneous calculations from data remaining from previous calculations. they should be suitably checked for accuracy and reliability (annex 11 p7.1). they should be stored in a manner which ensures appropriate version control (chapter 4 p4.1).\n\n2. what type of accuracy checks (annex 11 p 6) are expected for the use of spreadsheets? h+v february 2011\n\ndata integrity should be ensured by suitably implemented and risk-assessed controls. 
the calculations and the files should be secured in such a way that formulae are not accidentally overwritten. accidental input of an inappropriate data type should be prevented or result in an error message (e.g. text in a numeric field or a decimal format into an integer field). so-called boundary checks are encouraged.\n\n3. are there any specific considerations for the validation of spreadsheets? h+v february 2011\n\nvalidation according to paragraph 4 of annex 11 is required at least for spreadsheets that contain custom code (e.g. visual basic for applications). formulas or other types of algorithm should be verified for correctness.\n\n4. what measures are required to ensure data security of databases? h+v february 2011\n\ndata security includes integrity, reliability and availability of data. during validation of a database-based or inclusive system, consideration should be given to:\n\n- implementing procedures and mechanisms to ensure data security and keeping the meaning and logical arrangement of data;\n- load-testing, taking into account future growth of the database and tools to monitor the saturation of the database;\n- precautions for necessary migration of data (annex 11 p17) at the end of the life-cycle of the system.\n\n5. at which phases of the system life-cycle is risk management recommended? h+v february 2011\n\nrisk management should be applied throughout the whole life-cycle. a first risk assessment should be performed to determine the gmp criticality of the system, i.e. does the system have an impact on patient safety, product quality or data integrity? user-requirement specifications are usually developed with consideration of potential risks and form the basis for the first formal risk assessment. complex systems should be evaluated in further, more detailed risk assessments to determine critical functions. this will help ensure that validation activities cover all critical functions. 
risk management includes the implementation of appropriate controls and their verification.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "8d810076-8b93-4240-803c-b4451848f4ff": {"__data__": {"id_": "8d810076-8b93-4240-803c-b4451848f4ff", "embedding": null, "metadata": {"page_label": "2", "file_name": "[3] EU GMP guide annexes Supplementary requirements Annex 11 Computerised systems.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[3] EU GMP guide annexes Supplementary requirements Annex 11 Computerised systems.pdf", "file_type": "application/pdf", "file_size": 178746, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Validation and Compliance Requirements for Computerised Systems and Small Devices: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. What specific steps should be taken to validate small devices used in pharmaceutical manufacturing according to the EU GMP guide annexes, particularly in terms of vendor assessment and verification testing?\n \n2. How does the EU GMP guide annexes outline the process for conducting a retrospective validation of legacy computerised systems, including the necessity of defining user requirements and performing a gap analysis?\n\n3. 
According to the EU GMP guide annexes, what criteria determine the frequency of revalidation for computerised systems in the pharmaceutical industry, and what factors should be considered during periodic evaluations to ensure these systems remain in a validated state?", "prev_section_summary": "The section discusses the specific guidance provided in Annex 11 of the EU GMP guide regarding the implementation of controls for electronic document templates, particularly for spreadsheets, to ensure accuracy and reliability. It covers the requirements for accuracy checks, validation of spreadsheets containing custom code or algorithms, and considerations for data security in databases throughout their lifecycle. The key topics include appropriate controls for electronic documents, accuracy checks for spreadsheets, validation requirements, data security measures for databases, and the importance of risk management throughout the system life-cycle. Key entities mentioned include templates of spreadsheets, data integrity, validation procedures, data security mechanisms, risk management, and critical functions in complex systems.", "excerpt_keywords": "EU GMP guide, Annex 11, Computerised systems, Validation, Small devices"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[3] EU GMP guide annexes Supplementary requirements Annex 11 Computerised systems.pdf\n## 6. are user requirements needed as part of the retrospective validation of legacy systems? h+v february 2011\n\nthe way to check whether a computerised system is fit for its intended purpose is to define user requirements and perform a gap analysis to determine the validation effort for retrospective validation. these user requirements should be verified.\n\n## 7. when do i have to revalidate computerised systems? h+v february 2011\n\ncomputerised systems should be reviewed periodically to confirm that they remain in a validated state. 
periodic evaluation should include, where applicable, the current range of functionality, deviation records, change records, upgrade history, performance, reliability and security. the time period for re-evaluation and revalidation should be based on the criticality of the system.\n\n## 8. what are the requirements for storage time of electronic data and documents? h+v february 2011\n\nthe requirements for storage of electronically stored data and documents do not differ from those for paper documents. it should be ensured that electronic signatures applied to electronic records are valid for the entire storage period for documents.\n\n## 9. what are the relevant validation efforts for small devices? h+v february 2011\n\nsmall devices are usually off-the-shelf pieces of equipment that are widely used. in these cases, the development life-cycle is mainly controlled by the vendor. the pharmaceutical customer should therefore reasonably assess the vendor's capability of developing software according to common standards of quality. a vendor assessment needs to be performed and the application needs to be verified against the requirements for the intended use. from the perspective of the regulated industry, the implementation of such a device is driven by an implementation life-cycle. at minimum the following items need to be addressed:\n\n- requirement definition for the intended use including process limitations. this should also include a statement indicating whether data are stored or transferred to another system. as per the definition of a small device, data are not stored permanently but temporarily and are not to be modified by a user. therefore, limited user access handling is acceptable. 
it needs to be ensured that parameter data influencing the device's behaviour may not be altered without suitable permission;\n- risk assessment, taking into consideration the intended use and the risk to patients associated with the process supported by the small device;\n- vendor assessment;\n- list of available documentation from the vendor, especially those describing the methodology used and the calculation algorithm, if applicable. a vendor certificate or equivalent detailing the testing performed by the vendor may also be included;\n- calibration certificate, if applicable;\n- validation plan according to the risk-assessment results;\n- verification testing proving that the device fulfills the requirements for the intended use. it may be equivalent to a pq-phase.\n\nsmall manufacturing devices are sometimes only equipped with microprocessors and firmware and are not capable of high-level administration functions. moreover, data is often transient in nature in these devices. due to the latter there is no risk of", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "31f4d830-13e3-4b6e-a3bc-73fb75c40bd9": {"__data__": {"id_": "31f4d830-13e3-4b6e-a3bc-73fb75c40bd9", "embedding": null, "metadata": {"page_label": "3", "file_name": "[3] EU GMP guide annexes Supplementary requirements Annex 11 Computerised systems.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[3] EU GMP guide annexes Supplementary requirements Annex 11 Computerised systems.pdf", "file_type": "application/pdf", "file_size": 178746, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Enhancing Data Security: Alternative Controls for Monitoring Data Modifications", "questions_this_excerpt_can_answer": "1. 
What measures can be taken if a computerized system is unable to automatically generate printouts that indicate changes made to data since its original entry, according to the EU GMP guide annexes?\n \n2. How does the EU GMP guide annexes address the issue of computerized systems that lack the capability to provide automated audit trails for data modifications, specifically in the context of supporting batch release documentation?\n\n3. In the scenario where a computerized system does not support the functionality of generating audit trail reports automatically, what alternative procedure is deemed acceptable by the EU GMP guide annexes to ensure data integrity and compliance during batch release?", "prev_section_summary": "This section discusses the validation and compliance requirements for computerised systems and small devices in pharmaceutical manufacturing according to the EU GMP guide annexes. Key topics include the need for user requirements in retrospective validation of legacy systems, the criteria for revalidation of computerised systems, storage requirements for electronic data and documents, and validation efforts for small devices. Entities mentioned include user requirements, gap analysis, periodic evaluation, electronic signatures, vendor assessment, risk assessment, calibration certificate, validation plan, and verification testing.", "excerpt_keywords": "EU GMP guide, Annex 11, Computerised systems, Data security, Audit trails"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[3] EU GMP guide annexes Supplementary requirements Annex 11 Computerised systems.pdf\ninadvertently modifying data. an audit trail is therefore not necessary and user access may be limited to those functions of parameter control.\n\n10. what alternative controls are accepted in case a system is not capable to generate printouts indicating if any of the data has been changed since the original entry? 
h+v february 2011\n\nas long as this functionality is not supported by the supplier, it may be acceptable to describe in a procedure the fact that a print-out of the related audit trail report must be generated and linked manually to the record supporting batch release.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "98effcc6-a140-4126-935e-2a5f1150ce2c": {"__data__": {"id_": "98effcc6-a140-4126-935e-2a5f1150ce2c", "embedding": null, "metadata": {"page_label": "4", "file_name": "[3] EU GMP guide annexes Supplementary requirements Annex 11 Computerised systems.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[3] EU GMP guide annexes Supplementary requirements Annex 11 Computerised systems.pdf", "file_type": "application/pdf", "file_size": 178746, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Ensuring Data Integrity in Pharmaceutical Manufacturing: Strategies for Regulatory Compliance", "questions_this_excerpt_can_answer": "1. How does the EU GMP Chapter 1 relate to the principles of data integrity in pharmaceutical manufacturing, and what role does it assign to senior management in ensuring data integrity?\n \n2. What specific strategies are recommended for assessing and mitigating data integrity risks within pharmaceutical manufacturing environments, according to the principles outlined in the PIC/S scheme and related regulatory guidance?\n\n3. 
How does the complexity and consistency of business processes impact the risk of data integrity failures in pharmaceutical manufacturing, and what considerations should be made when integrating manual interfaces with IT systems to minimize these risks?", "prev_section_summary": "The section discusses alternative controls for monitoring data modifications in computerized systems as outlined in the EU GMP guide annexes. Key topics include the lack of automated audit trails for data modifications, the need for manual generation of audit trail reports, and the importance of ensuring data integrity and compliance during batch release. The section also addresses the acceptable procedures for systems that do not support automated audit trail generation. Key entities mentioned include the supplier, users, and batch release documentation.", "excerpt_keywords": "EU GMP, data integrity, pharmaceutical manufacturing, regulatory compliance, risk assessment"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[3] EU GMP guide annexes Supplementary requirements Annex 11 Computerised systems.pdf\n## data integrity\n\ndata integrity enables good decision-making by pharmaceutical manufacturers and regulatory authorities. it is a fundamental requirement of the pharmaceutical quality system described in eu gmp chapter 1, applying equally to manual (paper) and electronic systems.\n\npromotion of a quality culture together with implementation of organizational and technical measures which ensure data integrity is the responsibility of senior management. it requires participation and commitment by staff at all levels within the company, by the company's suppliers and by its distributors.\n\nsenior management should ensure that data integrity risk is assessed, mitigated and communicated in accordance with the principles of quality risk management. 
the effort and resource assigned to data integrity measures should be commensurate with the risk to product quality, and balanced with other quality assurance resource demands. where long term measures are identified in order to achieve the desired state of control, interim measures should be implemented to mitigate risk, and should be monitored for effectiveness.\n\nthe following questions and answers describe foundational principles which facilitate successful implementation of existing guidance published by regulatory authorities participating in the pic/s scheme. it should be read in conjunction with national guidance, medicines legislation and the gmp standards published in eudralex volume 4.\n\nthe importance of data integrity to quality assurance and public health protection should be included in personnel training programs.\n\n- who - annex 5: guidance on good data and record management practices\n\n### 1. how can data risk be assessed?\n\ndata risk assessment should consider the vulnerability of data to involuntary or deliberate amendment, deletion or recreation. control measures which prevent unauthorized activity and increase visibility / detectability can be used as risk mitigating actions.\n\nexamples of factors which can increase risk of data integrity failure include complex, inconsistent processes with open-ended and subjective outcomes. simple tasks which are consistent, well-defined and objective lead to reduced risk.\n\nrisk assessment should include a business process focus (e.g. production, qc) and not just consider IT system functionality or complexity. factors to consider include:\n\n- process complexity\n- process consistency, degree of automation / human interface\n- subjectivity of outcome / result\n- is the process open-ended or well defined\n\nthis ensures that manual interfaces with IT systems are considered in the risk assessment process. 
computerized system validation in isolation may not result in low data integrity risk, in particular when the user is able to influence the reporting of data from the validated system.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "23a87aaa-93fe-40e5-bd66-600e6a24d8cb": {"__data__": {"id_": "23a87aaa-93fe-40e5-bd66-600e6a24d8cb", "embedding": null, "metadata": {"page_label": "5", "file_name": "[3] EU GMP guide annexes Supplementary requirements Annex 11 Computerised systems.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[3] EU GMP guide annexes Supplementary requirements Annex 11 Computerised systems.pdf", "file_type": "application/pdf", "file_size": 178746, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Data Management Best Practices: Assessing Data Criticality, Managing Data Lifecycle, and Ensuring Data Integrity Measures", "questions_this_excerpt_can_answer": "1. How does the assessment of data criticality influence decision-making processes within the context of EU GMP guide annexes, specifically in relation to batch release decisions and the prioritization of data types such as compliance with critical quality attributes versus warehouse cleaning records?\n\n2. In the framework of EU GMP guide annexes, what specific elements and boundaries are involved in the data lifecycle of a product or process, including the transition across IT systems, quality system applications, and various organizational and external interfaces, particularly in the pharmaceutical industry?\n\n3. 
Why is the management of the data lifecycle considered crucial for maintaining data integrity within the pharmaceutical sector, according to the EU GMP guide annexes, and what specific stages and considerations are highlighted to ensure effective data integrity measures throughout the lifecycle of data?", "prev_section_summary": "The section discusses the importance of data integrity in pharmaceutical manufacturing, outlining the responsibilities of senior management in ensuring data integrity and the need for organizational and technical measures. It emphasizes the role of quality risk management in assessing and mitigating data integrity risks, and highlights the impact of business process complexity on data integrity failures. The section also addresses the need for training programs on data integrity and provides guidance on assessing data risk, including factors such as process complexity, consistency, and human interface. It stresses the importance of considering manual interfaces with IT systems in the risk assessment process to minimize data integrity risks.", "excerpt_keywords": "EU GMP guide, Annex 11, Computerised systems, Data criticality, Data lifecycle, Data integrity"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[3] EU GMP guide annexes Supplementary requirements Annex 11 Computerised systems.pdf\n## 2. how can data criticality be assessed?\n\nthe decision which data influences may differ in importance, and the impact of the data to a decision may also vary. points to consider regarding data criticality include:\n\n- what decision does the data influence? for example: when making a batch release decision, data which determines compliance with critical quality attributes is of greater importance than warehouse cleaning records.\n- what is the impact of the data to product quality or safety? 
for example: for an oral tablet, active substance assay data is of greater impact to product quality and safety than tablet dimensions data.\n\n## 3. what does data lifecycle refer to?\n\ndata lifecycle refers to how data is generated, processed, reported, checked, used for decision-making, stored and finally discarded at the end of the retention period. data relating to a product or process may cross various boundaries within the lifecycle, for example:\n\n- it systems\n- quality system applications\n- production\n- analytical\n- stock management systems\n- data storage (back-up and archival)\n- organisational\n- internal (e.g. between production, qc and qa)\n- external (e.g. between contract givers and acceptors)\n- cloud-based applications and storage\n\n## 4. why is data lifecycle management important to ensure effective data integrity measures?\n\ndata integrity can be affected at any stage in the lifecycle. it is therefore important to understand the lifecycle elements for each type of data or record, and ensure controls which are proportionate to data criticality and risk at all stages.\n\n## 5. 
what should be considered when reviewing the data lifecycle?\n\nthe data lifecycle refers to the:\n\n- generation and recording of data\n- processing into usable information\n- checking the completeness and accuracy of reported data and processed information\n- use of data (or results) to make a decision\n- retaining and retrieval of data which protects it from loss or unauthorized amendment\n- retiring or disposal of data in a controlled manner at the end of its life", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "6505eda7-0f1d-47b2-a143-31e31e6775a7": {"__data__": {"id_": "6505eda7-0f1d-47b2-a143-31e31e6775a7", "embedding": null, "metadata": {"page_label": "6", "file_name": "[3] EU GMP guide annexes Supplementary requirements Annex 11 Computerised systems.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[3] EU GMP guide annexes Supplementary requirements Annex 11 Computerised systems.pdf", "file_type": "application/pdf", "file_size": 178746, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Data Lifecycle Review and Risk Assessment in Computerised Systems: A Comprehensive Analysis", "questions_this_excerpt_can_answer": "1. How does EU GMP Annex 11 paragraph 4.3 contribute to the data lifecycle review process in computerised systems, and what role do business process owners and IT personnel play in this review?\n \n2. What specific risks and control measures should be considered regarding the creation, storage, and transfer of original data and metadata in computerised systems to ensure compliance with ALCOA principles and safeguard against data integrity failures?\n\n3. 
How does the guidance address the handling of data stored in temporary memory by computerised analytical and manufacturing equipment, including the risks associated with limited audit trail provision during this period, and what strategies are recommended to mitigate these risks?", "prev_section_summary": "This section discusses the assessment of data criticality, the concept of data lifecycle, the importance of data lifecycle management for ensuring data integrity measures, and considerations for reviewing the data lifecycle. Key topics include assessing data criticality for decision-making processes, understanding the data lifecycle from generation to disposal, and implementing controls proportionate to data criticality and risk. Entities mentioned include IT systems, quality system applications, production, analytical processes, stock management systems, data storage, organizational boundaries, and external interfaces in the pharmaceutical industry.", "excerpt_keywords": "EU GMP, Annex 11, Computerised systems, Data lifecycle, Data integrity"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[3] EU GMP guide annexes Supplementary requirements Annex 11 Computerised systems.pdf\ndata lifecycle reviews are applicable to both paper and electronic records, although control measures may be applied differently. in the case of computerised systems, the data lifecycle review should be performed by business process owners (e.g. production, qc) in collaboration with it personnel who understand the system architecture. the description of computerised systems required by eu gmp annex 11 paragraph 4.3 can assist this review. 
the application of critical thinking skills is important to not only identify gaps in data governance, but to also challenge the effectiveness of the procedural and systematic controls in place.\n\nsegregation of duties between data lifecycle stages provides safeguards against data integrity failure by reducing the opportunity for an individual to alter, misrepresent or falsify data without detection. data risk should be considered at each stage of the data lifecycle review.\n\ndata lifecycle: what risks should be considered when assessing the generation and recording of data?\n\nthe following aspects should be considered when determining risk and control measures:\n\nhow and where is original data created (i.e. paper or electronic)\nwhat metadata is associated with the data, to ensure a complete, accurate and traceable record, taking into account alcoa principles. does the record permit the reconstruction of the activity\nwhere is the data and metadata located\ndoes the system require that data is saved to permanent memory at the time of recording, or is it held in a temporary buffer\n\nin the case of some computerised analytical and manufacturing equipment, data may be stored as a temporary local file prior to transfer to a permanent storage location (e.g. server). during the period of temporary storage, there is often limited audit trail provision for amending, deleting or recreating data. this is a data integrity risk. removing the use of temporary memory (or reducing the time period that data is stored in temporary memory) reduces the risk of undetected data manipulation.\n\nis it possible to recreate, amend or delete original data and metadata; controls over paper records are discussed elsewhere in this guidance. computerised system controls may be more complex, including setting of user privileges and system configuration to limit or prevent access to amend data. 
it is important to review all data access opportunities, including IT helpdesk staff, who may make changes at the request of the data user. these changes should be procedurally controlled, visible and approved within the quality system.\n\nhow data is transferred to other locations or systems for processing or storage; data should be protected from the possibility of intentional or unintentional loss or amendment during transfer to other systems (e.g. for processing, review or storage). paper records should be protected from amendment or substitution. electronic interfaces should be validated to demonstrate security and no corruption of data, particularly where systems require an interface to present data in a different structure or file format.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "a1470842-4cbd-45fb-9e71-6004b63c1ff0": {"__data__": {"id_": "a1470842-4cbd-45fb-9e71-6004b63c1ff0", "embedding": null, "metadata": {"page_label": "7", "file_name": "[3] EU GMP guide annexes Supplementary requirements Annex 11 Computerised systems.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[3] EU GMP guide annexes Supplementary requirements Annex 11 Computerised systems.pdf", "file_type": "application/pdf", "file_size": 178746, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "\"Risk Management Strategies for Data Processing and Reporting\"", "questions_this_excerpt_can_answer": "1. What specific measures should be taken to ensure the integrity and version control of data processing methods in electronic data processing systems, according to the EU GMP guide annexes?\n \n2. 
How does the EU GMP guide annexes recommend handling situations where data has been processed multiple times to ensure each iteration is verifiable and maintains data integrity?\n\n3. According to the EU GMP guide annexes, what are the recommended practices for preserving the format of original data and ensuring it is accessible for a risk-based review by data reviewers?", "prev_section_summary": "The section discusses the importance of data lifecycle reviews in computerised systems, involving collaboration between business process owners and IT personnel. It highlights the risks and control measures related to the creation, storage, and transfer of data and metadata to ensure compliance with ALCOA principles and prevent data integrity failures. The handling of data stored in temporary memory by computerised equipment is also addressed, emphasizing the need for strategies to mitigate associated risks. Segregation of duties, data risk assessment, and controls over data access and transfer are key topics covered in the section.", "excerpt_keywords": "EU GMP guide, Annex 11, Computerised systems, Data integrity, Risk management"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[3] EU GMP guide annexes Supplementary requirements Annex 11 Computerised systems.pdf\ndata lifecycle: what risks should be considered when assessing the processing of data into usable information?\n\nthe following aspects should be considered when determining risk and control measures:\nhow is data processed;\ndata processing methods should be approved, identifiable and version controlled. in the case of electronic data processing, methods should be locked where appropriate to prevent unauthorised amendment.\nhow is data processing recorded;\nthe processing method should be recorded. 
in situations where raw data has been processed more than once, each iteration (including method and result) should be available to the data checker for verification.\ndoes the person processing the data have the ability to influence what data is reported, or how it is presented;\neven validated systems which do not permit the user to make any changes to data may be at risk if the user can choose what data is printed, reported or transferred for processing. this includes performing the activity multiple times as separate events and reporting a desired outcome from one of these repeats. data presentation (e.g. changing scale of graphical reports to enhance or reduce presentation of analytical peaks) can also influence decision making, and therefore impact data integrity.\n\ndata lifecycle: what risks should be considered when checking the completeness and accuracy of reported data and processed information?\n\nthe following aspects should be considered when determining risk and control measures:\nis original data (including the original data format) available for checking;\nthe format of the original data (electronic or paper) should be preserved, and available to the data reviewer in a manner which permits interaction with the data (e.g. search, query). this approach facilitates a risk-based review of the record, and can also reduce administrative burden, for instance utilizing validated audit trail exception reports instead of an onerous line-by-line review.\nare there any periods of time when data is not audit trailed;\nthis may present opportunity for data amendment which is not subsequently visible to the data reviewer. 
additional control measures should be implemented to reduce risk of undisclosed data manipulation.\ndoes the data reviewer have visibility and access to all data generated;", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "93bc0650-55fc-4fe0-bc1c-0156df9181d9": {"__data__": {"id_": "93bc0650-55fc-4fe0-bc1c-0156df9181d9", "embedding": null, "metadata": {"page_label": "8", "file_name": "[3] EU GMP guide annexes Supplementary requirements Annex 11 Computerised systems.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[3] EU GMP guide annexes Supplementary requirements Annex 11 Computerised systems.pdf", "file_type": "application/pdf", "file_size": 178746, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "\"Ensuring Data Integrity and Security Throughout the Data Lifecycle: Best Practices and Strategies\"", "questions_this_excerpt_can_answer": "1. How does Annex 11 of the EU GMP guide address the issue of data integrity and security throughout the data lifecycle, specifically in relation to the visibility and handling of failed, aborted, or discrepant data activities?\n\n2. What specific risks and control measures does Annex 11 recommend considering during the data lifecycle to ensure the integrity of data when making pass/fail decisions, particularly in relation to the timing of these decisions and the visibility of changes in the audit trail?\n\n3. 
According to Annex 11, what are the recommended practices for the storage, backup, and protection against loss or unauthorized amendment of data within the pharmaceutical quality system to ensure data integrity and security throughout its lifecycle?", "prev_section_summary": "The section discusses risk management strategies for data processing and reporting in electronic data processing systems according to the EU GMP guide annexes. Key topics include ensuring integrity and version control of data processing methods, handling situations where data has been processed multiple times, preserving the format of original data for review, and ensuring data accuracy and completeness. Entities mentioned include data processing methods, data checkers, data reviewers, and control measures to mitigate risks related to data manipulation and integrity.", "excerpt_keywords": "EU GMP guide, Annex 11, data integrity, security, data lifecycle"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[3] EU GMP guide annexes Supplementary requirements Annex 11 Computerised systems.pdf\nthis should include any data from failed or aborted activities, discrepant or unusual data which has been excluded from processing or the final decision-making process. visibility of all data provides protection against selective data reporting or testing into compliance.\n\n- does the data reviewer have visibility and access to all processing of data; this ensures that the final result obtained from raw data is based on good science, and that any data exclusion or changes to processing method is based on good science. visibility of all processing information provides protection against undisclosed processing into compliance.\n\n9. 
data lifecycle: what risks should be considered when data (or results) are used to make a decision?\n\nthe following aspects should be considered when determining risk and control measures:\n\n- when is the pass / fail decision taken; if data acceptability decisions are taken before a record (raw data or processed result) is saved to permanent memory, there may be opportunity for the user to manipulate data to provide a satisfactory result, without this change being visible in audit trail. this would not be visible to the data reviewer. this is a particular consideration where computerised systems alert the user to an out of specification entry before the data entry process is complete (i.e. the user saves the data entry), or saves the record in temporary memory.\n\n10. data lifecycle: what risks should be considered when retaining and retrieving data to protect it from loss or unauthorised amendment?\n\nthe following aspects should be considered when determining risk and control measures:\n\n- how / where is data stored; storage of data (paper or electronic) should be at secure locations, with access limited to authorised persons. the storage location must provide adequate protection from damage due to water, fire, etc.\n- what are the measures protecting against loss or unauthorised amendment; data security measures should be at least equivalent to those applied during the earlier data lifecycle stages. retrospective data amendment (e.g. via it helpdesk or data base amendments) should be controlled by the pharmaceutical quality system, with appropriate segregation of duties and approval processes.\n- is data backed up in a manner permitting reconstruction of the activity; back-up arrangements should be validated to demonstrate the ability to restore data following it system failure. 
in situations where metadata (including relevant operating system event logs) are stored in different file locations from raw data, the back-up process should be carefully designed to ensure that all data required to reconstruct a record is included.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "de675a68-a1ae-4ede-8fc4-433c43bc3949": {"__data__": {"id_": "de675a68-a1ae-4ede-8fc4-433c43bc3949", "embedding": null, "metadata": {"page_label": "9", "file_name": "[3] EU GMP guide annexes Supplementary requirements Annex 11 Computerised systems.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[3] EU GMP guide annexes Supplementary requirements Annex 11 Computerised systems.pdf", "file_type": "application/pdf", "file_size": 178746, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Data Lifecycle Management and Risk Considerations in GMP Compliance: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. What specific guidelines does the EU GMP Annex 11 suggest for managing the ownership and retrieval of data, especially in scenarios involving outsourced activities or data storage?\n\n2. According to the document, what factors should be considered when determining the risk and control measures for the retirement or disposal of data at the end of its lifecycle within a GMP-compliant environment?\n\n3. 
How does the document propose integrating a quality-risk management approach, specifically ICH Q9, into the management of data integrity throughout its lifecycle, and what role does senior management play in this process according to EU GMP guidelines?", "prev_section_summary": "The section discusses the importance of data integrity and security throughout the data lifecycle, specifically focusing on Annex 11 of the EU GMP guide. Key topics include the visibility and handling of failed, aborted, or discrepant data activities, risks and control measures for making pass/fail decisions, and practices for storage, backup, and protection against loss or unauthorized amendment of data. Entities mentioned include data reviewers, processing information, data acceptability decisions, storage locations, data security measures, and back-up arrangements.", "excerpt_keywords": "EU GMP, Annex 11, Data Lifecycle Management, Risk Considerations, GMP Compliance"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[3] EU GMP guide annexes Supplementary requirements Annex 11 Computerised systems.pdf\nsimilarly, true copies of paper records may be duplicated on paper, microfilm, or electronically, and stored in a separate location.\n\nwhat are ownership / retrieval arrangements, particularly considering outsourced activities or data storage;\n\na technical agreement should be in place which addresses the requirements of part i chapter 7 and part ii section 16 of the gmp guide.\n\n11. data lifecycle: what risks should be considered when retiring or disposal of data in a controlled manner at the end of its life?\n\nthe following aspects should be considered when determining risk and control measures:\n\n- the data retention period\nthis will be influenced by regulatory requirements and data criticality. 
when considering data for a single product, there may be different data retention needs for pivotal trial data and manufacturing process / analytical validation data compared to routine commercial batch data.\n- how data disposal is authorised\nany disposal of data should be approved within the quality system and be performed in accordance with a procedure to ensure compliance with the required data retention period.\n\n12. is it required by the eu gmp to implement a specific procedure for data integrity?\n\nthere is no requirement for a specific procedure, however it may be beneficial to provide a summary document which outlines the organisations total approach to data governance.\n\na compliant pharmaceutical quality system generates and assesses a significant amount of data. while all data has an overall influence on gmp compliance, different data will have different levels of impact to product quality.\n\na quality-risk management (ich q9) approach to data integrity can be achieved by considering data risk and data criticality at each stage in the data lifecycle. the effort applied to control measures should be commensurate with this data risk and criticality assessment.\n\nthe approach to risk identification, mitigation, review and communication should be iterative, and integrated into the pharmaceutical quality system. 
this should provide senior management supervision and permit a balance between data integrity and general gmp priorities in line with the principles of ich q9 & q10.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "687ab5b3-dae8-4f13-925b-53e6cedc4206": {"__data__": {"id_": "687ab5b3-dae8-4f13-925b-53e6cedc4206", "embedding": null, "metadata": {"page_label": "10", "file_name": "[3] EU GMP guide annexes Supplementary requirements Annex 11 Computerised systems.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[3] EU GMP guide annexes Supplementary requirements Annex 11 Computerised systems.pdf", "file_type": "application/pdf", "file_size": 178746, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Design and Control of Paper Documentation System for GMP Data Integrity: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. How does the EU GMP Annex 11 specifically correlate the ALCOA principles with the requirements for both medicinal products and active substances used as starting materials, providing chapter and paragraph references for each principle?\n\n2. What specific chapters and paragraphs in the EU GMP guidelines (Part I and Part II) address the requirement for data to be attributable, legible, contemporaneous, original, and accurate within the context of pharmaceutical manufacturing and documentation?\n\n3. 
What are the recommended practices for designing and controlling paper documentation systems to prevent unauthorized recreation of GMP data, including the management of template forms, according to the EU GMP guide annexes and supplementary requirements detailed in Annex 11?", "prev_section_summary": "The section discusses the management of data ownership and retrieval, especially in outsourced activities or data storage, as per EU GMP Annex 11 guidelines. It also covers the risks and control measures to consider when retiring or disposing of data at the end of its lifecycle within a GMP-compliant environment. The document proposes integrating a quality-risk management approach, specifically ICH Q9, into data integrity management throughout its lifecycle, with a focus on data risk and criticality assessment. Senior management's role in supervising and balancing data integrity with general GMP priorities is highlighted.", "excerpt_keywords": "EU GMP, Annex 11, Data Integrity, Paper Documentation System, ALCOA Principles"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[3] EU GMP guide annexes Supplementary requirements Annex 11 Computerised systems.pdf\n## 13. how are the data integrity expectations (alcoa) for the pharmaceutical industry prescribed in the existing eu gmp relating to active substances and dosage forms published in eudralex volume 4?\n\nthe main regulatory expectation for data integrity is to comply with the requirement of alcoa principles. 
the table below provides for each alcoa principle the link to eu gmp references (part i, part ii and annex 11):\n\n|alcoa principle|basic requirements for medicinal products (part i): chapter 4 (1) / chapter 6 (2)|basic requirements for active substances used as starting materials (part ii): chapter 5 (3) / chapter 6 (4)|\n|---|---|---|\n|attributable (data can be assigned to the individual performing the task)|[4.20, c & f], [4.21, c & i], [4.29, e]|[6.14], [6.18], [6.52]|\n|legible (data can be read by eye or electronically and retained in a permanent format)|[4.1], [4.2], [4.7], [4.8], [4.9], [4.10]|[5.43] [6.11], [6.14], [6.15], [6.16]|\n|contemporaneous (data is created at the time the activity is performed)|[4.8]|[6.14]|\n|original (data is in the same format as it was initially generated, or as a verified copy, which retains content and meaning)|[4.9], [4.27], [paragraph \"record\"]|[6.14], [6.15], [6.16]|\n|accurate (data is true / reflective of the activity or measurement performed)|[4.1], [6.17]|[5.40], [5.45], [6.6]|\n\n1chapter 4 (part i): documentation\n\n2chapter 6 (part i): quality control\n\n3chapter 5 (part ii): process equipment (computerized system)\n\n4chapter 6 (part ii): documentation and records\n\n## 14. how should the company design and control their paper documentation system to prevent the unauthorised re-creation of gmp data?\n\nthe template (blank) forms used for manual recordings may be created in an electronic system (word, excel, etc.). the corresponding master documents should be approved and controlled electronically or in paper versions. the following expectations should be considered for the template (blank) form:\n\n- have a unique reference number (including version number) and include reference to corresponding sop number\n- should be stored in a manner which ensures appropriate version control\n- if signed electronically, should use a secure e-signature\n\nthe distribution of template records (e.g. 
blank forms) should be controlled. the following expectations should be considered where appropriate, based on data risk and criticality:", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "92857046-d0f4-4b99-b507-f7c29b708f3a": {"__data__": {"id_": "92857046-d0f4-4b99-b507-f7c29b708f3a", "embedding": null, "metadata": {"page_label": "11", "file_name": "[3] EU GMP guide annexes Supplementary requirements Annex 11 Computerised systems.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[3] EU GMP guide annexes Supplementary requirements Annex 11 Computerised systems.pdf", "file_type": "application/pdf", "file_size": 178746, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "\"Ensuring Data Integrity and Traceability in Electronic Systems: Best Practices and Strategies\"", "questions_this_excerpt_can_answer": "1. What specific measures are recommended to ensure the traceability and integrity of paper-based records, such as blank forms, in a GMP environment according to the EU GMP guide annexes?\n \n2. How does the document suggest computerized systems should be designed to preserve original electronic data and maintain data integrity, as outlined in the EU GMP guide annexes?\n\n3. What is the significance of reviewing electronic data in the context of GMP-related decisions, and what potential risks does solely relying on printouts for review pose according to the insights provided in the EU GMP guide annexes?", "prev_section_summary": "This section discusses the correlation of ALCOA principles with EU GMP requirements for medicinal products and active substances, providing specific chapter and paragraph references. 
It also addresses the design and control of paper documentation systems to prevent unauthorized recreation of GMP data, including the management of template forms. The section emphasizes the importance of data integrity expectations in the pharmaceutical industry and outlines recommended practices for ensuring compliance with regulatory requirements.", "excerpt_keywords": "EU GMP guide, Data Integrity, Traceability, Electronic Systems, Compliance"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[3] EU GMP guide annexes Supplementary requirements Annex 11 Computerised systems.pdf\nenable traceability for issuance of the blank form by using a bound logbook with numbered pages or other appropriate system. for loose leaf template forms, the distribution date, a sequential issuing number, the number of the copies distributed, the department name where the blank forms are distributed, etc.\n\nshould be known\n\ndistributed copies should be designed to avoid photocopying either by using a secure stamp, or by the use of paper color code not available in the working areas or another appropriate system.\n\nwhat controls should be in place to ensure original electronic data is preserved?\n\ncomputerized systems should be designed in a way that ensures compliance with the principles of data integrity. the system design should make provisions such that original data cannot be deleted and for the retention of audit trails reflecting changes made to original data.\n\nwhy is it important to review electronic data?\n\nin the case of data generated from an electronic system, electronic data is the original record which must be reviewed and evaluated prior to making batch release decisions and other decisions relating to gmp related activities (e.g. approval of stability results, analytical method validation etc.). 
in the event that the review is based solely on printouts there is potential for records to be excluded from the review process which may contain un-investigated out of specification data or other data anomalies. the review of the raw electronic data should mitigate risk and enable detection of data deletion, amendment, duplication, reusing and fabrication which are common data integrity failures.\n\nexample of an inspection citing:\n\nraw data for hplc/gc runs which had been invalidated was stored separately to the qc raw data packages and had not been included in the review process.\n\nin the above situation, the procedure for review of chromatographic data packages did not require a review of the electronic raw data or a review of relevant audit trails associated with the analyses. this led to the exclusion of records from the review process and to lack of visibility of changes made during the processing and reporting of the data. the company was unable to provide any explanation for the data which had been invalidated.\n\nis a risk-based review of electronic data acceptable?\n\nyes. the principles of quality risk management may be applied during the review of electronic data and review by exception is permitted, when scientifically justified. exception reporting is used commonly as a tool to focus the review of electronic data such as (but not limited to) electronic batch records. exception reporting rapidly highlights to the reviewer one of the most critical elements of batch review, i.e. the exceptions. the level of review of the full electronic batch record can vary based on the exceptions as well as the level of confidence and experience with a particular process. 
appropriate testing and validation must be completed for the automated system and the", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "47efd296-7683-40a4-b25a-0289c44a5574": {"__data__": {"id_": "47efd296-7683-40a4-b25a-0289c44a5574", "embedding": null, "metadata": {"page_label": "12", "file_name": "[3] EU GMP guide annexes Supplementary requirements Annex 11 Computerised systems.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[3] EU GMP guide annexes Supplementary requirements Annex 11 Computerised systems.pdf", "file_type": "application/pdf", "file_size": 178746, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Ensuring Data Integrity and Compliance in Outsourced GMP Activities: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. What specific steps should a company take to ensure ongoing compliance with its data governance policy/procedures during self-inspections, especially in relation to the data lifecycle elements discussed earlier in the document?\n\n2. How should a company approach the verification of data integrity and governance systems when outsourcing GMP activities to another company, and what are the key components of a formal assessment for a contract acceptor's competency and compliance in this regard?\n\n3. 
What strategies should a recipient (contract giver) employ to build and maintain confidence in the validity of documents such as Certificates of Analysis (CoA) provided by a supplier (contract acceptor), especially considering the significance of reviewing summary data for outsourced activities?", "prev_section_summary": "The section discusses measures recommended to ensure traceability and integrity of paper-based records in a GMP environment, as well as how computerized systems should be designed to preserve original electronic data and maintain data integrity. It emphasizes the importance of reviewing electronic data for GMP-related decisions and highlights the risks of solely relying on printouts for review. The section also mentions the significance of quality risk management in the review of electronic data and the use of exception reporting to focus on critical elements of batch review. Key topics include traceability of blank forms, design of computerized systems for data integrity, review of electronic data, and risk-based review practices. Key entities mentioned include bound logbooks, sequential issuing numbers, secure stamps, audit trails, electronic data, batch release decisions, quality risk management, exception reporting, and validation of automated systems.", "excerpt_keywords": "Data Integrity, Compliance, Outsourced GMP Activities, Contract Acceptors, Certificate of Analysis"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[3] EU GMP guide annexes Supplementary requirements Annex 11 Computerised systems.pdf\noutput batch exception report to ensure its functionality meets the business and regulatory requirements as per gmp.\n\n18. what are the expectations for the self-inspection program related to data integrity? ongoing compliance with the companys data governance policy/procedures should be reviewed during self-inspection, to ensure that they remain effective. 
this may also include elements of the data lifecycle discussed in q3-q9.\n\n19. what are my companys responsibilities relating to data integrity for gmp activities contracted out to another company? data integrity requirements should be incorporated into the companys contractor/vendor qualification/assurance program and associated procedures. in addition to having their own data governance systems, companies outsourcing activities should verify the adequacy of comparable systems at the contract acceptor. the contract acceptor should apply equivalent levels of control to those applied by the contract giver. formal assessment of the contract acceptors competency and compliance in this regard should be conducted in the first instance prior to the approval of a contractor, and thereafter verified on a periodic basis at an appropriate frequency based on risk.\n\n20. how can a recipient (contract giver) build confidence in the validity of documents such as certificate of analysis (coa) provided by a supplier (contract acceptor)? the recipient should have knowledge of the systems and procedures implemented at the supplier for the generation of the coa. arrangements should be in place to ensure that significant changes to systems are notified and the effectiveness of these arrangements should be subjected to periodic review. data related to activities which are outsourced are routinely provided as summary data in a report format (e.g. coa). these summary documents are reviewed on a routine basis by the contract acceptor and therefore the review of data integrity at the contract acceptor site on a regular periodic basis (e.g. during on-site audit) takes on even greater significance, in order to build and maintain confidence in the summary data provided.\n\n21. what are the expectations in relation to contract calibration service providers who conduct calibrations on-site and/or off-site? are audits of these companies premises required? 
using the principles of qrm to assess data criticality and risk, the company should include assessment of data governance systems implemented by the service provider when making decisions on service contracts. this may be achieved by on-site audit or desk-based assessment of information submitted by the service provider.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "de10bc30-2086-42c2-8663-02927d06037a": {"__data__": {"id_": "de10bc30-2086-42c2-8663-02927d06037a", "embedding": null, "metadata": {"page_label": "13", "file_name": "[3] EU GMP guide annexes Supplementary requirements Annex 11 Computerised systems.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[3] EU GMP guide annexes Supplementary requirements Annex 11 Computerised systems.pdf", "file_type": "application/pdf", "file_size": 178746, "creation_date": "2024-04-07", "last_modified_date": "2024-03-28", "document_title": "Ensuring Data Integrity in the Supply Chain for Medicinal Products: Responsibilities and Actions", "questions_this_excerpt_can_answer": "1. What specific steps should a company take if an approved contractor receives a warning letter or statement of non-compliance concerning data integrity from a regulatory authority, according to the EU GMP guide annexes?\n \n2. How does the EU GMP guide annexes suggest a company should manage the risk to its products if an approved contractor is found non-compliant in terms of data integrity?\n\n3. 
According to the EU GMP guide annexes, who holds the final responsibility for ensuring data integrity and compliance throughout the supply chain for medicinal products, and how should responsibilities be documented between parties?", "prev_section_summary": "The section discusses the importance of ensuring data integrity and compliance in outsourced GMP activities. Key topics include self-inspection programs for data integrity, responsibilities for data integrity in contracted activities, building confidence in documents provided by suppliers, and expectations for contract calibration service providers. Entities mentioned include the company, contract acceptor, contract giver, and service providers. The section emphasizes the need for ongoing compliance with data governance policies, verification of data integrity systems, formal assessments of contract acceptors, and regular reviews of outsourced data to maintain confidence in summary documents.", "excerpt_keywords": "EU GMP guide, Data integrity, Supply chain, Medicinal products, Responsibilities"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[3] EU GMP guide annexes Supplementary requirements Annex 11 Computerised systems.pdf\n## 22. what is expected of my company in the event that one of my approved contractors is issued with a warning letter/statement of non-compliance concerning data integrity, from a regulatory authority?\n\nwhat is expected of my company in the event that one of my approved contractors (e.g. active substance manufacturer, finished product manufacturer, quality control laboratory etc.) is issued with a warning letter/statement of non-compliance concerning data integrity, from a regulatory authority?\n\nit is considered that the company should evaluate the risk to its products manufactured/released using the principles of quality risk management. risk assessments should be made available to inspectors, on request. 
depending on the outcome of the risk assessment, appropriate action should be taken which may entail delisting the contractor from the approved contractor list. in the event that abnormal disruption in supply may result from a contractor compliance situation, relevant regulatory authorities should be consulted in this regard.\n\n## 23. where does my companys responsibility begin and end in relation to data integrity aspects of the supply chain for medicinal products?\n\nall actors in the supply chain play an important part in overall data integrity and assurance of product quality. data governance systems should be implemented from the manufacture of starting materials right through to the delivery of medicinal products to persons authorized or entitled to supply medicinal products to the public. relative responsibilities and boundaries should be documented in the contracts between the relevant parties. final responsibility of ensuring compliance throughout the supply chain rests with batch certifying qp.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "d0e5740c-4900-4891-a9cc-bb46207b9572": {"__data__": {"id_": "d0e5740c-4900-4891-a9cc-bb46207b9572", "embedding": null, "metadata": {"page_label": "1", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Guideline on Computerised Systems and Electronic Data Management in Clinical Trials", "questions_this_excerpt_can_answer": "1. 
What specific document does the new EMA Guideline on Computerised Systems and Electronic Data Management in Clinical Trials replace, and what was its reference number?\n2. What are the key dates associated with the adoption and public consultation process of the EMA Guideline on Computerised Systems and Electronic Data in Clinical Trials, including the start and end of the public consultation period, as well as the final version adoption date?\n3. What are some of the specific areas of focus or keywords highlighted in the EMA Guideline on Computerised Systems and Electronic Data Management in Clinical Trials, particularly regarding the aspects of clinical trial data management it aims to address?", "excerpt_keywords": "computerised systems, electronic data, validation, audit trail, user management"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\n## good clinical practice inspectors working group (gcp iwg)\n\nguideline on computerised systems and electronic data in clinical trials\n\nadopted by gcp iwg for release for consultation: 4 march 2021\n\nstart of public consultation: 18 june 2021\n\nend of consultation (deadline for comments): 17 december 2021\n\nfinal version adopted by the gcp iwg: 7 march 2023\n\ndate of coming into effect: 6 months after publication\n\nthis guideline replaces the reflection paper on expectations for electronic source data and data transcribed to electronic data collection tools in clinical trials (ema/ins/gcp/454280/2010).\n\nkeywords\ncomputerised systems, electronic data, validation, audit trail, user management, security, electronic clinical outcome assessment (ecoa), interactive response technology (irt), case report form (crf), electronic signatures, artificial intelligence (ai)", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": 
"{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "d77c7ec8-96d9-42db-b75d-2d25de2fcb54": {"__data__": {"id_": "d77c7ec8-96d9-42db-b75d-2d25de2fcb54", "embedding": null, "metadata": {"page_label": "2", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Ensuring Regulatory Compliance and Data Integrity in Computerized Systems: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. What specific principles and guidelines does the EMA document outline for ensuring data integrity within computerized systems used in clinical trials, particularly regarding the ALCOA++ principles?\n \n2. How does the document address the validation of computerized systems and the management of electronic signatures to comply with regulatory requirements in clinical trials?\n\n3. What recommendations does the EMA guideline provide for the use of cloud solutions in the storage and management of electronic data in clinical trials, including considerations for data protection and control?", "prev_section_summary": "The section discusses the new EMA Guideline on Computerised Systems and Electronic Data Management in Clinical Trials, which replaces a previous reflection paper. Key topics include the adoption and public consultation process dates, areas of focus such as validation, audit trail, user management, security, electronic clinical outcome assessment, interactive response technology, electronic signatures, and artificial intelligence. 
The document also mentions the Good Clinical Practice Inspectors Working Group (GCP IWG) as the entity responsible for the guideline.", "excerpt_keywords": "EMA, computerised systems, electronic data, clinical trials, data integrity"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\n|glossary|5|\n|---|---|\n|abbreviations|7|\n|executive summary|8|\n|1. introduction|8|\n|2. scope|9|\n|3. legal and regulatory background|10|\n|4. principles and definition of key concepts|10|\n|4.1. data integrity|10|\n|4.2. responsibilities|11|\n|4.3. data and metadata|11|\n|4.4. source data|11|\n|4.5. alcoa++ principles|12|\n|4.6. criticality and risks|13|\n|4.7. data capture|14|\n|4.8. electronic signatures|15|\n|4.9. data protection|16|\n|4.10. validation of systems|16|\n|4.11. direct access|16|\n|5. computerised systems|17|\n|5.1. description of systems|17|\n|5.2. documented procedures|17|\n|5.3. training|17|\n|5.4. security and access control|17|\n|5.5. timestamp|18|\n|6. electronic data|18|\n|6.1. data capture and location|18|\n|6.1.1. transcription|18|\n|6.1.2. transfer|18|\n|6.1.3. direct capture|19|\n|6.1.4. edit checks|19|\n|6.2. audit trail and audit trail review|19|\n|6.2.1. audit trail|19|\n|6.2.2. audit trail review|20|\n|6.3. sign-off of data|21|\n|6.4. copying data|21|\n|6.5. certified copies|22|\n|6.6. control of data|22|\n|6.7. 
cloud solutions|23|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "12f56622-0ed8-49d2-bab5-c99d64e16074": {"__data__": {"id_": "12f56622-0ed8-49d2-bab5-c99d64e16074", "embedding": null, "metadata": {"page_label": "3", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Data Management and Security Best Practices: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. What specific guidelines does the EMA document provide regarding the validation and testing of computerised systems in clinical trials, including the steps for ensuring traceability of requirements and execution of test plans?\n\n2. How does the document outline the management of user access and privileges within clinical trial data systems, particularly focusing on the principles of segregation of duties, least-privilege rule, and the management of unique usernames?\n\n3. What comprehensive security measures are recommended by the document for protecting electronic data in clinical trials, including the strategies for vulnerability management, penetration testing, and the implementation of password policies and remote authentication methods?", "prev_section_summary": "The section outlines the principles and guidelines for ensuring data integrity within computerized systems used in clinical trials, focusing on the ALCOA++ principles. 
It addresses the validation of computerized systems and the management of electronic signatures to comply with regulatory requirements. Additionally, recommendations are provided for the use of cloud solutions in the storage and management of electronic data in clinical trials, with considerations for data protection and control. Key topics include data integrity, responsibilities, data and metadata, source data, electronic signatures, data protection, validation of systems, and the use of cloud solutions.", "excerpt_keywords": "Data Management, Security Best Practices, EMA Guideline, Clinical Trials, Electronic Data"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\n## backup of data\n\npage: 24\n\n## contingency plans\n\npage: 24\n\n## migration of data\n\npage: 24\n\n## archiving\n\npage: 25\n\n## database decommissioning\n\npage: 25\n\n## annex 1 agreements\n\npage: 27\n\n## annex 2 computerised systems validation\n\npage: 30\n\n|a2.1 general principles|page: 30|\n|---|---|\n|a2.2 user requirements|page: 31|\n|a2.3 trial specific configuration and customisation|page: 31|\n|a2.4 traceability of requirements|page: 31|\n|a2.5 validation and test plans|page: 31|\n|a2.6 test execution and reporting|page: 32|\n|a2.7 release for production|page: 32|\n|a2.8 user helpdesk|page: 32|\n|a2.9 periodic review|page: 33|\n|a2.10 change control|page: 33|\n\n## annex 3 user management\n\npage: 34\n\n|a3.1 user management|page: 34|\n|---|---|\n|a3.2 user reviews|page: 34|\n|a3.3 segregation of duties|page: 34|\n|a3.4 least-privilege rule|page: 34|\n|a3.5 individual accounts|page: 34|\n|a3.6 unique usernames|page: 35|\n\n## annex 4 security\n\npage: 36\n\n|a4.1 ongoing security measures|page: 36|\n|---|---|\n|a4.2 physical security|page: 36|\n|a4.3 firewalls|page: 36|\n|a4.4 vulnerability management|page: 36|\n|a4.5 platform management|page: 
37|\n|a4.6 bi-directional devices|page: 37|\n|a4.7 anti-virus software|page: 37|\n|a4.8 penetration testing|page: 37|\n|a4.9 intrusion detection and prevention|page: 37|\n|a4.10 internal activity monitoring|page: 37|\n|a4.11 security incident management|page: 38|\n|a4.12 authentication method|page: 38|\n|a4.13 remote authentication|page: 38|\n|a4.14 password managers|page: 38|\n|a4.15 password policies|page: 39|\n|a4.16 password confidentiality|page: 39|\n|a4.17 inactivity logout|page: 39|\n|a4.18 remote connection|page: 39|\n|a4.19 protection against unauthorised back-end changes|page: 39|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "136c588d-0495-4e31-9fa8-02cce5cfce6c": {"__data__": {"id_": "136c588d-0495-4e31-9fa8-02cce5cfce6c", "embedding": null, "metadata": {"page_label": "4", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Advancements in Clinical Trials: Utilizing Clinical Systems and Electronic Tools", "questions_this_excerpt_can_answer": "1. What are the specific considerations outlined in Annex 5 of the EMA Guideline on computerised systems and electronic data in clinical trials for utilizing electronic clinical outcome assessment tools in clinical trials?\n\n2. How does the EMA Guideline address the process of purchasing, developing, or updating computerised systems by clinical trial sites, as detailed in section A6.1 of Annex 6?\n\n3. 
What are the guidelines provided by the EMA for ensuring confidentiality and security in the use of clinical systems within clinical trials, as specified in sections A6.5 and A6.6 of Annex 6?", "prev_section_summary": "The section discusses key topics related to data management and security best practices in clinical trials, including backup of data, contingency plans, migration of data, archiving, database decommissioning, agreements, computerised systems validation, user management, and security measures. The entities mentioned include user requirements, trial specific configuration, traceability of requirements, validation and test plans, user helpdesk, change control, user reviews, segregation of duties, least-privilege rule, unique usernames, physical security, firewalls, vulnerability management, penetration testing, intrusion detection, authentication methods, password policies, and protection against unauthorized changes. These topics and entities are essential for ensuring the integrity and security of electronic data in clinical trials.", "excerpt_keywords": "Clinical trials, Electronic data, Computerised systems, EMA Guideline, Data integrity"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\n|content|page number|\n|---|---|\n|annex 5 additional consideration to specific systems|40|\n|a5.1 electronic clinical outcome assessment|40|\n|a5.2 interactive response technology system|45|\n|a5.3 electronic informed consent|46|\n|annex 6 clinical systems|50|\n|a6.1 purchasing, developing, or updating computerised systems by sites|50|\n|a6.2 site qualification by the sponsor|50|\n|a6.3 training|50|\n|a6.4 documentation of medical oversight|50|\n|a6.5 confidentiality|51|\n|a6.6 security|51|\n|a6.7 user management|51|\n|a6.8 direct access|51|\n|a6.9 trial specific data acquisition tools|52|\n|a6.10 archiving|52|", 
"start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "bb99cc00-3055-40dd-92d4-ccc7ea673c81": {"__data__": {"id_": "bb99cc00-3055-40dd-92d4-ccc7ea673c81", "embedding": null, "metadata": {"page_label": "5", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Comprehensive Clinical Trial Guidelines: Key Terms and Best Practices", "questions_this_excerpt_can_answer": "1. How does the EMA Guideline define the term \"audit trail\" in the context of computerised systems used in clinical trials, and what is its significance for reconstructing events related to electronic records?\n\n2. What is the comprehensive definition of \"computerised system life cycle\" according to the EMA Guideline on computerised systems and electronic data in clinical trials, and what are the key phases included in this life cycle?\n\n3. In the context of the EMA Guideline, how is \"configuration\" of a computerised system distinguished from other forms of system setup, and what specific skills or knowledge does it require?", "prev_section_summary": "The section discusses specific considerations outlined in Annex 5 of the EMA Guideline on computerised systems and electronic data in clinical trials for utilizing electronic clinical outcome assessment tools, interactive response technology systems, and electronic informed consent. 
It also addresses guidelines for purchasing, developing, and updating computerised systems by clinical trial sites, site qualification by sponsors, training, documentation of medical oversight, confidentiality, security, user management, direct access, trial-specific data acquisition tools, and archiving within clinical systems in clinical trials.", "excerpt_keywords": "EMA Guideline, computerised systems, electronic data, clinical trials, audit trail"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\n## glossary\n\ngenerally used terms\n\nunless otherwise specified (e.g. source data or source document) and in order to simplify the text, data will be used in this guideline in a broad meaning, which may include documents, records or any form of information.\n\nall references to sponsors and investigators in this guideline also apply to their service providers, irrespective of the services provided.\n\nwhen a computerised system is implemented by an institution where the investigator is conducting a clinical trial, any reference to the investigator in this guideline also includes the institution, when applicable.\n\nthe term trial participant is used in this text as a synonym for the term subject, which is defined in regulation (eu) no 536/2014 as an individual who participates in a clinical trial, either as a recipient of the imp or as a control.\n\nthe term responsible party is frequently used instead of sponsor or principal investigator. please also refer to section 4.2. 
and annex 1.\n\nthe term agreement is used as an overarching term for all types of documented agreements, including contracts.\n\nthe term validation encompasses aspects usually known as qualification and validation.\n\nartificial intelligence\n\nartificial intelligence (ai) covers a very broad set of algorithms, which enable computers to mimic human intelligence. it ranges from simple if-then rules and decision trees to machine learning and deep learning.\n\naudit trail\n\nin computerised systems, an audit trail is a secure, computer-generated, time-stamped electronic record that allows reconstruction of the events relating to the creation, modification, or deletion of an electronic record.\n\nclinical outcome assessment\n\nclinical outcome assessment (coa) employs a tool for the reporting of outcomes by clinicians, trial site staff, observers, trial participants and their caregivers. the term coa is proposed as an umbrella term to cover measurements of signs and symptoms, events, endpoints, health-related quality of life (hrql), health status, adherence to treatment, satisfaction with treatment, etc.\n\ncomputerised system life cycle\n\nthe life cycle of a computerised system includes all phases of the system; i.e. typically 1) the concept phase where the responsible party considers automating a process and where user requirements are collected, 2) the project phase where a service provider can be selected, a risk assessment is made, and the system is implemented and validated, 3) the operational phase where a system is used in a regulated environment and changes are implemented in a manner that maintains data confidentiality, integrity and availability, and finally, 4) a retirement phase, which includes decisions about data retention/archiving, migration or destruction and the management of these processes.\n\nconfiguration\n\nconfiguration sets up a system using existing (out-of-the-box) functionality. 
it requires no programming knowledge.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "a0c4883f-7f61-4da0-a93c-50fcf7696204": {"__data__": {"id_": "a0c4883f-7f61-4da0-a93c-50fcf7696204", "embedding": null, "metadata": {"page_label": "6", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Data Management and System Validation in Clinical Trials: Ensuring Accuracy and Compliance", "questions_this_excerpt_can_answer": "1. How does the document define the term \"patient-reported outcome\" (PRO) and what specific aspects does it propose PRO covers in the context of clinical trials?\n \n2. What criteria does the document outline for retaining an electronic copy of dynamic files in clinical trials, especially concerning the preservation of their dynamic nature and metadata?\n\n3. According to the document, what are the key components and considerations involved in the validation process of computerized systems used in clinical trials, particularly in relation to system impact on human subject protection and clinical trial result reliability?", "prev_section_summary": "The section provides definitions and explanations of key terms related to computerised systems and electronic data in clinical trials as outlined in the EMA Guideline. Key topics include the definition of data, the roles of sponsors and investigators, the use of computerised systems in clinical trials, and the importance of agreements and validation. 
The section also defines terms such as artificial intelligence, audit trail, clinical outcome assessment, computerised system life cycle, and configuration. These definitions help clarify the roles, processes, and terminology involved in the use of computerised systems in clinical trials.", "excerpt_keywords": "Data Management, System Validation, Clinical Trials, Electronic Data, Computerised Systems"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\n## customisation\n\ncustomisation modifies and adds to existing functionality by custom coding. it requires programming knowledge.\n\n## data governance\n\nthe total of activities, processes, roles, policies, and standards used to manage and control the data during the entire data life cycle, while adhering to alcoa++ principles (see section 4.5.).\n\n## data life cycle\n\nall processes related to the creating, recording, processing, reviewing, changing, analysing, reporting, transferring, storing, migrating, archiving, retrieving, and deleting of data.\n\n## dynamic file formats\n\ndynamic files include automatic processing and/or enable an interactive relationship with the user. a certified electronic copy may be retained in electronic file formats that are different from the original record, but the equivalent dynamic nature (including metadata) of the original record should be retained.\n\n## event log\n\nan automated log of events in relation to the use of a system, such as system access, alerts or the firing of edit checks.\n\n## patient-reported outcome\n\nany outcome reported directly by the trial participant and based on the trial participant's perception of a disease and its treatment(s) is called patient-reported outcome (pro). 
the term pro is proposed as an umbrella term to cover both single dimension and multi-dimension measurements of symptoms, hrql, health status, adherence to treatment, satisfaction with treatment, etc. (source: chmp reflection paper on the regulatory guidance for the use of hrql measures in the evaluation of medicinal products - emea/chmp/ewp/139391/2004)\n\n## static file formats\n\nstatic files containing information or data that are fixed and allow no dynamic interaction.\n\n## validation\n\na process of establishing and documenting that the specified requirements of a computerized system can be consistently fulfilled from design until decommissioning of the system or transition to a new system. the approach to validation should be based on a risk assessment that takes into consideration the intended use of the system and the potential of the system to affect human subject protection and reliability of clinical trial results. (ich e6 r2 1.65)", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "d720fa18-c4f3-400b-b11c-e832487384d0": {"__data__": {"id_": "d720fa18-c4f3-400b-b11c-e832487384d0", "embedding": null, "metadata": {"page_label": "7", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Understanding Clinical Trial Technology and Terminology: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. 
What does the acronym \"ALCOA++\" stand for in the context of data integrity principles within clinical trials, and how does it expand upon the original ALCOA framework?\n \n2. How does the document define the term \"eSource\" in the context of electronic data collection and management in clinical trials, distinguishing it from traditional data sources?\n\n3. What specific technologies and methodologies does the document outline for ensuring the traceability and integrity of electronic data in clinical trials, as indicated by the inclusion of terms such as \"ETMF\" (Electronic Trial Master File) and \"EDC\" (Electronic Data Collection)?", "prev_section_summary": "The section discusses key concepts related to data management and system validation in clinical trials, including customisation, data governance, data life cycle, dynamic file formats, event log, patient-reported outcome, static file formats, and validation. It defines terms such as patient-reported outcome (PRO) and outlines criteria for retaining electronic copies of dynamic files in clinical trials. 
The section also emphasizes the importance of validation in ensuring that computerized systems meet specified requirements and do not compromise human subject protection or the reliability of clinical trial results.", "excerpt_keywords": "Clinical trials, Electronic data, Data integrity, ESource, EDC"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\n|ai|artificial intelligence|\n|---|---|\n|alcoa++|attributable, legible, contemporaneous, original, accurate, complete, consistent, enduring, available when needed and traceable|\n|byod|bring your own device|\n|coa|clinical outcome assessment|\n|ctms|clinical trial management systems|\n|dmp|data management plan|\n|dsmb|data and safety monitoring board|\n|ecoa|electronic coa|\n|ecrf|electronic case report form|\n|edc|electronic data collection|\n|ema|european medicines agency|\n|epro|electronic pro|\n|esource|electronic source|\n|etmf|electronic tmf|\n|gcp iwg|gcp inspectors working group|\n|gcp|good clinical practice|\n|gps|global positioning system|\n|hrql|health-related quality of life|\n|html|hypertext mark-up language|\n|https|hypertext transfer protocol secure|\n|iaas|infrastructure as a service|\n|ib|investigator brochures|\n|imp|investigational medicinal product|\n|irt|interactive response technologies|\n|it|information technology|\n|js|javascript|\n|kpi|key performance indicator|\n|paas|platform as a service|\n|pc|personal computer|\n|pdf|portable document format|\n|pro|patient-reported outcome|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "6e92df35-22f0-4985-8c5d-ef8578bd2403": {"__data__": {"id_": "6e92df35-22f0-4985-8c5d-ef8578bd2403", "embedding": null, "metadata": {"page_label": "8", "file_name": "[2] EMA 
Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Guidelines for Implementing Computerised Systems in Clinical Trials", "questions_this_excerpt_can_answer": "1. What are the key reasons outlined by the EMA Guideline for the increasing need to provide guidance on the use of computerised systems and electronic data collection in clinical trials?\n \n2. How does the EMA Guideline propose to ensure the quality, reliability of trial data, and the protection of trial participants' rights, dignity, safety, and wellbeing through the use of computerised systems in clinical trials?\n\n3. What specific aspects of computerised systems and electronic data management does the EMA Guideline cover in terms of validation, user management, security, and the data life cycle to address the challenges presented by the evolving complexity of clinical research data and trial types?", "prev_section_summary": "The section provides a comprehensive guide on understanding clinical trial technology and terminology. It covers key topics such as data integrity principles within clinical trials, the definition of eSource in electronic data collection, and technologies/methodologies for ensuring traceability and integrity of electronic data. 
The section also includes a list of acronyms and terms related to clinical trials technology, such as ALCOA++, eSource, ETMF, EDC, and others.", "excerpt_keywords": "EMA, computerised systems, electronic data, clinical trials, guidelines"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\n## acronyms\n\n|saas|software as a service|\n|---|---|\n|sae|serious adverse event|\n|sop|standard operating procedures|\n|susar|suspected unexpected serious adverse reactions|\n|tmf|trial master file|\n|uat|user acceptance test|\n|ups|uninterruptible power supplies|\n|urs|user requirements specification|\n|usb|universal serial bus|\n|utc|coordinated universal time|\n|vpn|virtual private network|\n\n## executive summary\n\ncomputerised systems are being increasingly used in clinical research. the complexity of such systems has evolved rapidly in the last few years from electronic case report forms (ecrf), electronic patient reported outcomes (epros) to various wearable devices used to continuously monitor trial participants for clinically relevant parameters and ultimately to the use of artificial intelligence (ai). hence, there is a need to provide guidance to all stakeholders involved in clinical trials reflective of these changes in data types and trial types on the use of computerised systems and on the collection of electronic data, as this is important to ensure the quality and reliability of trial data, as well as the rights, dignity, safety and wellbeing of the trial participants. this would ultimately contribute to a robust decision-making process based on such clinical data.\n\nthis guideline will describe some generally applicable principles and definition of key concepts. it also covers requirements and expectations for computerised systems, including validation, user management, security, and electronic data for the data life cycle. 
requirements and expectations related to specific types of systems, processes, and data are also covered.\n\n## 1. introduction\n\nas described above, the change in data and trial types and thereby the use of computerised systems presents new challenges. the european medicines agency (ema) reflection paper on expectations for electronic source data and data transcribed to electronic data collection tools in clinical trials started to address these when it was published in 2010. however, the development of and experience with such systems has progressed. a more up-to-date guideline is needed to replace the reflection paper.\n\nthere is no requirement or expectation that sponsors and investigators use computerised systems to collect data; however, the use of data acquisition tools, if implemented and controlled to the described standard, offers a wide variety of functions to improve data completeness, consistency and unambiguity, e.g. automatic edit checks, automated data transfers, validation checks, assisting information and workflow control.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "ee78622d-89bb-4115-83f1-b1eabfa07ae5": {"__data__": {"id_": "ee78622d-89bb-4115-83f1-b1eabfa07ae5", "embedding": null, "metadata": {"page_label": "9", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "The Comprehensive Role of Computerised Systems in Clinical Trials: A Review of Applications and Benefits", 
"questions_this_excerpt_can_answer": "1. What types of electronic systems and tools are identified by the EMA guideline as crucial for capturing and managing electronic clinical data in the conduct of clinical trials for investigational medicinal products (IMPs)?\n\n2. How does the EMA guideline address the use of electronic informed consent within clinical trials, and what are the considerations regarding its implementation according to national legislation?\n\n3. In what ways does the EMA guideline suggest managing and monitoring investigational medicinal products (IMPs) and clinical samples' transit and storage temperatures to ensure the reliability of trial data?", "prev_section_summary": "The section discusses the increasing use of computerised systems in clinical research, covering the evolution of data types and trial types, the need for guidance to ensure quality and reliability of trial data, and the protection of trial participants' rights and wellbeing. It outlines key principles, definitions, and requirements for computerised systems, including validation, user management, security, and electronic data management throughout the data life cycle. The section also mentions the European Medicines Agency's reflection paper on electronic source data and the need for an updated guideline to address the challenges presented by the use of computerised systems in clinical trials. 
Key topics include the use of data acquisition tools, functions to improve data quality, and the importance of robust decision-making based on clinical data.", "excerpt_keywords": "computerised systems, electronic data, clinical trials, investigational medicinal products, EMA guideline"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\n## scope\n\nthe scope of this guideline is computerised systems (including instruments, software and software as a service) used in the creation/capture of electronic clinical data and the control of other processes with the potential to affect participant protection and reliability of trial data, in the conduct of a clinical trial of investigational medicinal products (imps). these include, but may not be limited to, the following:\n\n- electronic medical records, used by the investigator to capture all health information as per normal clinical practice.\n- tools supplied to investigators/trial participants for recording clinical data via data entry (e.g. electronic clinical outcome assessments [ecoas]).\n- electronic trial participant data capture devices used to collect epro data, e.g. mobile devices supplied to trial participants or applications for use by the trial participant on their own device, i.e. bring your own device (byod).\n- electronic devices used by clinicians to collect data, e.g. mobile devices supplied to clinicians.\n- tools supplied for the automatic capture of data for trial participants such as biometrics, e.g. wearables or sensors.\n- ecrfs (e.g.
desktop or mobile device-based programs or access to web-based applications), which may contain source data directly entered, transcribed data, or data transferred from other sources, or any combination of these.\n- tools that automatically capture data related to the transit and storage temperatures for investigational medicinal product (imp) or clinical samples.\n- tools to capture, generate, handle, or store data in a clinical environment where analysis, tests, scans, imaging, evaluations, etc. involving trial participants or samples from trial participants are performed in support of clinical trials (e.g. lc-ms/ms systems, medical imaging and related software).\n- etmfs, which are used to maintain and archive the clinical trial essential documentation.\n- electronic informed consent, for the provision of information and/or capture of the informed consent when this is allowed according to national legislation, e.g. desktop or mobile device-based programs supplied to potential trial participants or applications for use by the potential trial participants on their byod or access to web-based applications.\n- interactive response technologies (irt), for the management of randomisation, supply and receipt of imp, e.g. via a web-based application.\n- portals or other systems for supplying information from the sponsor to the sites (e.g. investigator brochures (ibs), suspected unexpected serious adverse reactions (susars) or training material), from the sites to the sponsor (e.g. the documentation of the investigator's review of important safety information), or from the sponsor or the site to adjudication committees and others.\n- systems/tools used to conduct remote activities such as monitoring or auditing.\n- other computerised systems implemented by the sponsor holding/managing and/or analysing or reporting data relevant to the clinical trial e.g.
clinical trial management systems (ctms), pharmacovigilance databases, statistical software, document management systems, test management systems and central monitoring software.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "357521f8-eb0f-4f06-ad73-46b1840dee14": {"__data__": {"id_": "357521f8-eb0f-4f06-ad73-46b1840dee14", "embedding": null, "metadata": {"page_label": "10", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "AI and Data Integrity in Clinical Trials: Legal, Regulatory, and Ethical Principles", "questions_this_excerpt_can_answer": "1. What specific AI applications in clinical trials are mentioned in the EMA Guideline on computerised systems and electronic data, and what is the guideline's initial stance on setting additional requirements for AI beyond those applicable to all systems?\n \n2. How does the EMA Guideline on computerised systems and electronic data in clinical trials advise sponsors, investigators, and other parties to comply with EU regulations and ICH E6 Good Clinical Practice (GCP) regarding the use of computerised systems and the collection of electronic data?\n\n3. 
What principles does the EMA Guideline outline for achieving data integrity in clinical trials, and what are the key components of data governance as described within the guideline?", "prev_section_summary": "The section discusses the scope of the EMA guideline on computerised systems and electronic data in clinical trials, focusing on the various types of electronic systems and tools crucial for capturing and managing electronic clinical data. Key topics include electronic medical records, tools for recording clinical data, devices for data capture, tools for automatic data capture, eCRFs, tools for managing transit and storage temperatures, ETMFs, electronic informed consent, interactive response technologies, portals for information exchange, systems for remote activities, and other computerised systems used in clinical trials. The section emphasizes the importance of participant protection and data reliability in the conduct of clinical trials for investigational medicinal products.", "excerpt_keywords": "AI applications, clinical trials, EMA guideline, data integrity, electronic data"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\n## ai used in clinical trials\n\nai used in clinical trials e.g. for trial participant recruitment, determination of eligibility, coding of events and concomitant medication, data clarification, query processes and event adjudication. requirements to ai beyond the generally applicable expectations to all systems will not be covered in this guideline initially. this may be covered in a future annex.\n\nthe approach towards computerised systems used in clinical practice (e.g. 
regarding validation) should be risk proportionate (please also refer to section 4.6.).\n\n## legal and regulatory background\n\n- regulation (eu) no 536/2014, or directive 2001/20/ec and directive 2005/28/ec\n- ich guideline for good clinical practice e6 r2 (ema/chmp/ich/135/1995 revision 2)\n\nthis guideline is intended to assist the sponsors, investigators, and other parties involved in clinical trials to comply with the requirements of the current legislation (regulation (eu) no 536/2014, directive 2001/20/ec and directive 2005/28/ec), as well as ich e6 good clinical practice (gcp), regarding the use of computerised systems and the collection of electronic data in clinical trials.\n\nthe risk-based approach to quality management also has an impact on the use of computerised systems and the collection of electronic data. consideration should also be given to meeting the requirements of any additional current legal and regulatory framework that may apply in addition to the medicinal product regulatory framework, depending on the digital technology. these may include e.g. medical device legislation, data protection legislation, and legislation on electronic identification and electronic signatures.\n\nfurther elaboration of the expectations of the eu gcp inspectors working group (gcp iwg) on various topics, including those on computerised systems, can be found as gcp iwg q&as published on the ema website.\n\n## principles and definition of key concepts\n\nthe following sections outline the basic principles that apply to all computerised systems used in clinical trials.\n\n### data integrity\n\ndata integrity is achieved when data (irrespective of media) are collected, accessed, and maintained in a secure manner, to fulfil the alcoa++ principles of being attributable, legible, contemporaneous, original, accurate, complete, consistent, enduring, available when needed and traceable as described in section 4.5.
in order for the data to adequately support robust results and good decision making throughout the data life cycle. assuring data integrity requires appropriate quality and risk management systems as described in section 4.6., including adherence to sound scientific principles and good documentation practices.\n\n- data governance should address data ownership and responsibility throughout the data life cycle, and consider the design, operation, and monitoring of processes/systems to comply with the principles of data integrity including control over intentional and unintentional changes to data.\n- data governance systems should include staff training on the importance of data integrity principles and the creation of a working environment that enables visibility, and actively encourages reporting of omissions and erroneous results.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "b9727fcf-83c0-41cb-904e-f3eb541f2fd4": {"__data__": {"id_": "b9727fcf-83c0-41cb-904e-f3eb541f2fd4", "embedding": null, "metadata": {"page_label": "11", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Ensuring Data Integrity and Managing Source Data in Clinical Trials", "questions_this_excerpt_can_answer": "1. What are the two primary parties assigned the responsibility for the conduct of clinical trials as per the legislation, and how do they interact with computerised systems for data management?\n \n2. 
How does the document define metadata in the context of clinical trials, and what are some examples of metadata types that provide context to data points within these trials?\n\n3. What constitutes source data in clinical trials according to the document, and what are some examples of source documents that might contain this data?", "prev_section_summary": "This section discusses the use of AI in clinical trials, legal and regulatory background including EU regulations and ICH E6 Good Clinical Practice, and principles for achieving data integrity in clinical trials. Key topics include AI applications in clinical trials, compliance with regulations, risk-based approach to quality management, data integrity principles, data governance, and staff training on data integrity principles. Entities mentioned include sponsors, investigators, EU regulations, ICH guidelines, data governance systems, and staff training.", "excerpt_keywords": "Clinical trials, Data integrity, Metadata, Source data, Electronic data"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\nlack of integrity before the expiration of the mandated retention period may render the data unusable and is equivalent to data loss/destruction.\n\n#### responsibilities\n\nroles and responsibilities in clinical trials should be clearly defined. 
the responsibility for the conduct of clinical trials is assigned via legislation to two parties, which may each have implemented computerised systems for holding/managing data:\n\n- investigators and their institutions, laboratories and other technical departments or clinics, generate and store the data, construct the record, and may use their own software and hardware (purchased, part of national or institutional health information systems, or locally developed).\n- sponsors that supply, store, and/or manage and operate computerised systems (including software and hardware) and the records generated by them. sponsors may do this directly, or via service providers, including organisations providing e.g. ecoa, ecrf, or irt that collect and store data on behalf of sponsors.\n\nplease refer to annex 1 regarding the transfer/delegation to service providers of tasks related to the use of computerised systems and services.\n\n#### data and metadata\n\nelectronic data consist of individual data points. data become information when viewed in context. metadata provide context for the data point. different types of metadata exist such as: variable name, unit, field value before and after change, reason for change, trial master file (tmf) location, document identifier, timestamp, user. typically, these are data that describe the characteristics, structure, data elements and inter-relationships of data, e.g. audit trails. metadata also permit data to be attributable to an individual entering or taking an action on the data such as modifying, deleting, reviewing, etc. (or, if automatically generated, to the original data source). metadata form an integral part of the original record. without the context provided by metadata, the data have no meaning. loss of metadata may result in a lack of data integrity and may render the data unusable.\n\n#### source data\n\nthe term source data refers to the original reported observation in a source document. source documents could be e.g.
hospital records, clinical and office charts, laboratory notes. other examples are emails, spreadsheets, audio and/or video files, images, and tables in databases. the location of source documents and the associated source data they contain should be clearly identified at all points within the data capture process.\n\nbelow is an outline (figure 1) of the data processing stages, starting with the data capture. the correct identification of source data is important for adequate source data verification and archiving. data at different processing stages can be considered source depending on the preceding processing steps.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "3def8f76-66c9-41a3-a089-029020d79bda": {"__data__": {"id_": "3def8f76-66c9-41a3-a089-029020d79bda", "embedding": null, "metadata": {"page_label": "12", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Data Processing and Retention in Clinical Trials: Best Practices and Guidelines", "questions_this_excerpt_can_answer": "1. What are the guidelines for processing and retaining electronic source data in clinical trials as per the EMA Guideline on computerised systems and electronic data?\n \n2. How does the document define the principles of ALCOA++ in the context of data integrity and management in clinical trials, and what specific attributes are considered universally important to data according to these principles?\n\n3. 
What steps are recommended for ensuring that electronic data generated or captured during clinical trials is representative of the original observation, including the validation process and the inclusion of metadata?", "prev_section_summary": "This section discusses the importance of data integrity and managing source data in clinical trials. It outlines the responsibilities of investigators and sponsors in conducting clinical trials and using computerised systems for data management. The document defines metadata in the context of clinical trials and provides examples of metadata types. It also explains what constitutes source data in clinical trials and gives examples of source documents that may contain this data. The section emphasizes the significance of metadata in providing context to data points and highlights the importance of correctly identifying and verifying source data throughout the data processing stages in clinical trials.", "excerpt_keywords": "Data Processing, Retention, Electronic Source Data, ALCOA++, Metadata"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\n## data\n\nthis data (could be multiple computer files) is not human readable and requires image capture software to convert to human readable image.\n\n## image\n\nimage may be very complex and can be viewed by humans using the specific software.\n\n## annotated image\n\nselected areas or parts of the image may be the region of interest and subject to analysis to generate an outcome.\n\n## reported results\n\nresults are transcribed into the crf.\n\nsponsor receives the data and undertakes data analysis for the trial.\n\n## data capture sometimes requires some degree of processing prior to data recording.\n\nin this process, the data generated during an observation, measurement or data collection is checked, processed, and transferred into a new format 
and then recorded.\n\nthe retention of unprocessed data records is not always feasible. if the processing is an integral part of the solution used and is recognizable as such in the solution characteristics, there is no need to extract and retain the unprocessed data. it should be possible to validate the correct operation of the processing.\n\nas a general principle, the source data should be processed as little as possible and as much as necessary.\n\nfrom a practical point of view, the first obtainable permanent data from an electronic data generation/capture should be considered and defined as the electronic source data. this process should be validated to ensure that the source data generated/captured is representative of the original observation and should contain metadata, including audit trail, to ensure adherence to the alcoa++ principles (see section 4.5.). the location where the source data is first obtained should be part of the metadata.\n\n## alcoa++ principles\n\na number of attributes are considered of universal importance to data. these include that the data are:\n\n- attributable: data should be attributable to the person and/or system generating the data. based on the criticality of the data, it should also be traceable to the system/device, in which the data were generated/captured. the information about originator (e.g. system operator, data originator) and system (e.g. device, process) should be kept as part of the metadata.\n- legible: data should be maintained in a readable form to allow review in its original context. 
therefore, changes to data, such as compression, encryption and coding, should be completely reversible.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "28457a4f-6ee7-4c36-a0cd-c5dac8f7dfe2": {"__data__": {"id_": "28457a4f-6ee7-4c36-a0cd-c5dac8f7dfe2", "embedding": null, "metadata": {"page_label": "13", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Data Quality Management in Clinical Trials: Strategies and Best Practices", "questions_this_excerpt_can_answer": "1. How does the EMA guideline suggest ensuring the accuracy of data transferred between computerised systems in clinical trials, and what specific processes are recommended for validating this data transfer?\n\n2. According to the EMA guideline, what are the key characteristics that data should possess throughout its life cycle in clinical trials, and how does the guideline propose maintaining the traceability of data changes?\n\n3. What is the role of a quality management system with a risk-based approach in clinical trials as described in the EMA guideline, and how should risks be considered at both the system level and specific clinical trial level?", "prev_section_summary": "The section discusses the guidelines for processing and retaining electronic source data in clinical trials as per the EMA Guideline on computerised systems and electronic data.
It defines the principles of ALCOA++ in the context of data integrity and management in clinical trials, emphasizing the importance of attributes such as data being attributable and legible. The section also outlines steps for ensuring that electronic data generated or captured during clinical trials is representative of the original observation, including validation processes and the inclusion of metadata. It highlights the importance of processing data as little as possible and as much as necessary, and the validation of the source data to ensure adherence to the ALCOA++ principles.", "excerpt_keywords": "EMA, guideline, computerised systems, electronic data, clinical trials"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\n## contemporaneous\n\ndata should be generated by a system or captured by a person at the time of the observation. the time point of the observation and the time point of the storage should be kept as part of the metadata, including the audit trail. accurate date and time information should be automatically captured and should be linked and set by an external standard.\n\n## original\n\ndata should be the original first generation/capture of the observation. certified copies can replace original data (see section 6.5. on certified copies). information that is originally captured in a dynamic state should remain available in that state.\n\n## accurate\n\nthe use of computerised systems should ensure that the data are at least as accurate as those recorded on paper. the coding process, which consists in matching text or data collected on the data acquisition tools to terms in a standard dictionary, thesaurus, or tables (e.g. units, scales), should be controlled. the process of data transfer between systems should be validated to ensure the data remain accurate. 
data should be an accurate representation of the observations made. metadata should contain information to describe the observations and, where appropriate, it could also contain information to confirm its accuracy.\n\n## complete\n\nto reconstruct and fully understand an event, data should be a complete representation of the observation made. this includes the associated metadata and audit trail and may require preserving the original context.\n\n## consistent\n\nprocesses should be in place to ensure consistency of the definition, generation/capturing and management (including migration) of data throughout the data life cycle. processes should be implemented to detect and/or avoid contradictions, e.g. by the use of standardisation, data validation and appropriate training.\n\n## enduring\n\ndata should be maintained appropriately such that they remain intact and durable through the entire data life cycle, as appropriate, according to regulatory retention requirements (see sections 6.8. and 6.10. on back-up and archiving).\n\n## available when needed\n\ndata should be stored throughout the data life cycle and should be readily available for review when needed.\n\n## traceable\n\ndata should be traceable throughout the data life cycle. any changes to the data, to the context/metadata should be traceable, should not obscure the original information and should be explained, if necessary. changes should be documented as part of the metadata (e.g. audit trail).\n\n## 4.6. criticality and risks\n\nich e6 describes the need for a quality management system with a risk-based approach. risks should be considered at both the system level e.g. standard operating procedures (sops), computerised systems and staff, and for the specific clinical trial e.g. 
trial specific data and data acquisition tools or trial specific configurations or customisations of systems.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "bc5fd7d0-0513-4081-997c-ba1caf5bf0fa": {"__data__": {"id_": "bc5fd7d0-0513-4081-997c-ba1caf5bf0fa", "embedding": null, "metadata": {"page_label": "14", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Mitigating Risks in Computerised Systems for Clinical Trials: Strategies for Ensuring Data Integrity and Regulatory Compliance", "questions_this_excerpt_can_answer": "1. What specific strategies are recommended for mitigating risks associated with the use of computerised systems in clinical trials, particularly those risks that could impact the rights, safety, and well-being of trial participants or the reliability of trial results?\n \n2. How does the document suggest handling the complexity and interdependency of computerised systems or their components, especially in the context of ensuring data integrity and assessing risk levels in clinical trials?\n\n3. 
What are the guidelines for ensuring that data collected or generated in clinical trials adhere to ALCOA++ principles, and what measures should be taken to manage data integrity risks across the data lifecycle, from generation to destruction?", "prev_section_summary": "The section discusses the key characteristics that data should possess throughout its life cycle in clinical trials, including being contemporaneous, original, accurate, complete, consistent, enduring, available when needed, and traceable. It emphasizes the importance of generating or capturing data at the time of observation, maintaining the original state of data, ensuring accuracy through validation processes, and preserving data integrity and traceability. Additionally, it highlights the role of a quality management system with a risk-based approach in considering risks at both the system and specific clinical trial levels.", "excerpt_keywords": "Clinical trials, Computerised systems, Data integrity, Regulatory compliance, Risk mitigation"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\nrisks in relation to the use of computerised systems and especially critical risks affecting the rights, safety and well-being of the trial participants or the reliability of the trial results would be those related to the assurance of data integrity. those risks should be identified, analysed, and mitigated or accepted, where justified, throughout the life cycle of the system. where applicable, mitigating actions include revised system design, configuration or customisation, increased system validation or revised sops (including appropriate training) for the use of systems and data governance culture.\n\nin general, risks should be determined based on the system used, its complexity, operator, use of system and data involved. 
critical component parts of any system should always be addressed. for example, a component part of an irt system that calculates imp dose based on data input by the investigator would be high risk compared to other functionalities such as the generation of an imp shipment report. the interface and interdependency between systems or system components should be taken into consideration.\n\nall data collected or generated in the context of a clinical trial should fulfil alcoa++ principles. consequently, the arrangements for data governance to ensure that data, irrespective of the format in which they are generated, recorded, processed (including analysis, alteration/imputation, transformation, or migration), used, retained (archived), retrieved and destroyed should be considered for data integrity risks and appropriate control processes implemented.\n\nthe approach used to reduce risks to an acceptable level should be proportionate to the significance of the risk. risk reduction activities may be incorporated in protocol design and implementation, system design, coding and validation, monitoring plans, agreements between parties that define roles and responsibilities, systematic safeguards to ensure adherence to sops, training in processes and procedures, etc.\n\nthere are special risks to take into consideration when activities are transferred/delegated. these are further elaborated on in annex 1 on agreements.\n\nthe risk-assessment should take the relevance of the system use for the safety, rights, dignity and well-being of the participant and the importance and integrity of derived clinical trial data into account i.e. whether the system is used for standard care and safety measurements for participants or if systems are used to generate primary efficacy data that are relied on in e.g. a marketing authorisation application. 
systems used for purposes other than those they were developed for, or which are used outside the supplier's specification/validation, are inherently higher risk. in case of well-established computerised systems, which are used as intended in a routine setting for less critical trial data, the certification by a notified body may suffice as documentation whereas other more critical systems may require a more in-depth validation effort. this decision should be justified prior to use in the trial.\n\nfor systems deployed by the investigator/institution specifically for the purposes of clinical trials, the investigator should ensure that the requirements for computerised systems as described in this guideline are addressed and proportionately implemented. for systems deployed by the investigator/institution, the sponsor should determine during site selection whether such systems (e.g. electronic medical records and other record keeping systems for source data collection and the investigator site file) are fit for purpose.\n\nfor computerised systems deployed by the sponsor, the sponsor should ensure that the requirements of this guideline are addressed and proportionately implemented.\n\ndata capture\n\nthe clinical trial protocol should specify data to be collected and the processes to capture them, including by whom, when and by which tools.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "88d737af-4fd7-4ba2-be27-d4920dcec747": {"__data__": {"id_": "88d737af-4fd7-4ba2-be27-d4920dcec747", "embedding": null, "metadata": {"page_label": "15", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic 
data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Electronic Data Management and Electronic Signatures in Clinical Trials: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. What specific functionalities must an electronic signature system include to meet the requirements outlined in the EMA Guideline on computerised systems and electronic data in clinical trials, particularly regarding authentication, non-repudiation, unbreakable link, and timestamping?\n\n2. How does the EMA Guideline differentiate between the use of electronic signatures in closed and open systems within clinical trials, and what specific regulation is mentioned as not applicable for closed systems?\n\n3. According to the EMA Guideline on computerised systems and electronic data in clinical trials, what measures should be taken to ensure the traceability of data transformations and derivations during the processing and analysis stages of electronic data management in clinical trials?", "prev_section_summary": "The section discusses strategies for mitigating risks associated with the use of computerised systems in clinical trials, particularly focusing on data integrity and regulatory compliance. Key topics include identifying and mitigating critical risks, ensuring data collected adheres to ALCOA++ principles, handling system complexity and interdependency, and implementing appropriate control processes for data integrity risks. 
Entities mentioned include trial participants, system components, data governance, protocols, agreements, investigators, sponsors, and data capture processes.", "excerpt_keywords": "Electronic Data Management, Electronic Signatures, Clinical Trials, EMA Guideline, Data Integrity"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\ndata acquisition tools should be designed and/or configured or customised to capture all information required by the protocol and not more. data fields should not be prepopulated or automatically filled in, unless these fields are not editable and are derived from already entered data (e.g. body surface area). the protocol should identify any data to be recorded directly in the data acquisition tools and identify them as source data.\n\na detailed diagram and description of the transmission of electronic data (data flow) should be available in the protocol or a protocol-related document. the sponsor should describe which data will be transferred and in what format, the origin and destination of the data, the parties with access to the transferred data, the timing of the transfer and any actions that may be applied to the data, for example, data validation, reconciliation, verification, and review. the use of a data management plan (dmp) is encouraged.\n\nthe sponsor should ensure the traceability of data transformations and derivations during data processing and analysis.\n\n## electronic signatures\n\nwhenever ich e6 requires a document to be signed and an electronic signature is used for that purpose, the electronic signature functionality should meet the expectations stated below regarding authentication, non-repudiation, unbreakable link, and timestamp of the signature.\n\nthe system should thus include functionality to:\n\n- authenticate the signatory, i.e. 
establish a high degree of certainty that a record was signed by the claimed signatory;\n- ensure non-repudiation, i.e. that the signatory cannot later deny having signed the record;\n- ensure an unbreakable link between the electronic record and its signature, i.e. that the contents of a signed (approved) version of a record cannot later be changed by anyone without the signature being rendered visibly invalid;\n- provide a timestamp, i.e. that the date, time, and time zone when the signature was applied is recorded.\n\nelectronic signatures can further be divided into two groups depending on whether the identity of the signatory is known in advance, i.e. signatures executed in closed and in open systems.\n\nfor closed systems, which constitute the majority of systems used in clinical trials and which are typically provided by the responsible party or by their respective service provider, the system owner knows the identity of all users and signatories and grants and controls their access rights to the system. regulation (eu) no 910/2014 (eidas) on electronic identification and trust services for electronic transactions is not applicable for closed systems (eidas article 2.2). the electronic signature functionality in these systems should be proven during system validation to meet the expectations mentioned above.\n\nfor open systems, the signatories (and users) are not known in advance. for sites located in the eu, electronic signatures should meet the requirements defined in the eidas regulation. sites located in third countries should use electronic or digital signature solutions compliant with local regulations and proven to meet the expectations mentioned above.\n\nirrespective of the media used, in case a signature is applied on a different document or only on part of a document (e.g. 
signature page), there should still be an unbreakable link between the electronic document to be signed and the document containing the signature.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "6b44fc9f-a7a9-4ea8-a48b-fd72b9174395": {"__data__": {"id_": "6b44fc9f-a7a9-4ea8-a48b-fd72b9174395", "embedding": null, "metadata": {"page_label": "16", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Ensuring Data Protection, System Validation, and Direct Access in Clinical Trials: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. How does the document address the balance between GDPR compliance and the necessity to retain clinical trial data that could potentially identify participants, especially in the context of the \"right to be forgotten\"?\n \n2. What specific guidelines does the document provide for the transfer of personal data of trial participants from an EU member state to a third country or international organization, in terms of compliance with EU data protection legislation?\n\n3. 
What are the document's recommendations for maintaining the validated state of computerised systems used in clinical trials, including the validation of trial-specific configurations or customisations?", "prev_section_summary": "This section discusses the requirements and guidelines for electronic data management and electronic signatures in clinical trials as outlined in the EMA Guideline on computerised systems and electronic data. Key topics include data acquisition tools, data flow, data management plans, and electronic signatures. The section emphasizes the importance of authentication, non-repudiation, unbreakable link, and timestamping in electronic signatures, and differentiates between closed and open systems in clinical trials. It also mentions the regulation (EU) No 910/2014 (eIDAS) on electronic identification and trust services for electronic transactions and the traceability of data transformations and derivations during data processing and analysis.", "excerpt_keywords": "Data Protection, System Validation, Clinical Trials, GDPR Compliance, Electronic Data"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\n## 4.9. data protection\n\nthe confidentiality of data that could identify trial participants should be protected, respecting privacy and confidentiality rules in accordance with the applicable regulatory requirement(s). the requirements of general data protection regulation (eu) no 2016/679 (gdpr) on the protection of individuals with regard to the processing of personal data and on the free movement of such data should be followed except when specific requirements are implemented for clinical trials e.g. that a trial participant does not have the right to be forgotten (and for the data to be consequently deleted) as this would cause bias to e.g. 
safety data (regulation (eu) no 536/2014 recital 76 and article 28(3)). trial participants should be informed accordingly.\n\nin accordance with eu data protection legislation, if personal data of trial participants from an eu member state are processed (at rest or in transit) or transferred to a third country or international organisation, such data transfer must comply with applicable union data protection legislation. in summary, this means that the transfer must either be carried out on the basis of an adequacy decision (article 45 of gdpr, article 47 of regulation (eu) no 2018/1727 - eudpr), be subject to appropriate safeguards (as listed in article 46 of gdpr or article 48 of eudpr), or take place only if a derogation for specific situations applies (under article 49 of gdpr or article 50 of eudpr).\n\n## 4.10. validation of systems\n\ncomputerised systems used within a clinical trial should be subject to processes that confirm that the specified requirements of a computerised system are consistently fulfilled, and that the system is fit for purpose. validation should ensure accuracy, reliability, and consistent intended performance, from the design until the decommissioning of the system or transition to a new system.\n\nthe processes used for the validation should be decided upon by the system owner (e.g. sponsors, investigators, technical facilities) and described, as applicable. system owners should ensure adequate oversight of validation activities (and associated records) performed by service providers to ensure suitable procedures are in place and that they are being adhered to.\n\ndocumentation (including information within computerised systems used as process tools for validation activities) should be maintained to demonstrate that the system is maintained in the validated state. 
such documentation should be available for both the validation of the computerised system and for the validation of the trial specific configuration or customisation.\n\nvalidation of the trial specific configuration or customisation should ensure that the system is consistent with the requirements of the approved clinical trial protocol and that robust testing of functionality implementing such requirements is undertaken, for example, eligibility criteria questions in an ecrf, randomisation strata and dose calculations in an irt system. see annex 2 for further detail on validation.\n\n## 4.11. direct access\n\nall relevant computerised systems should be readily available with full, direct and read-only access (this requires a unique identification method e.g. username and password) upon request by inspectors from regulatory authorities. if a computerised system is decommissioned, direct access (with a unique identification method) to the data in a timely manner should still be ensured (see section 6.12.).", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "149e8c2b-c1b2-48fb-9a1d-fc5db1cb3157": {"__data__": {"id_": "149e8c2b-c1b2-48fb-9a1d-fc5db1cb3157", "embedding": null, "metadata": {"page_label": "17", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Computerised Systems in Clinical Trials: Requirements, Procedures, Training, and Security", "questions_this_excerpt_can_answer": "1. 
What specific sections of the EMA Guideline on computerised systems and electronic data in clinical trials detail the requirements for validation, user management, and IT security?\n \n2. How does the guideline recommend handling the training of individuals involved in conducting a clinical trial, especially in relation to computerised systems, and what is the protocol for documenting such training?\n\n3. What measures does the guideline suggest for maintaining data integrity and protecting the rights of trial participants in terms of security and access control for computerised systems used in clinical trials?", "prev_section_summary": "This section discusses data protection, system validation, and direct access in clinical trials. Key topics include the protection of data that could identify trial participants, compliance with EU data protection legislation when transferring personal data, and the validation of computerised systems used in clinical trials. Entities mentioned include trial participants, system owners (such as sponsors and investigators), service providers, and regulatory authorities. The section emphasizes the importance of maintaining the validated state of computerised systems and ensuring direct access to relevant systems for regulatory inspections.", "excerpt_keywords": "Computerised systems, Electronic data, Clinical trials, Validation, Data integrity"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\n## 5. computerised systems\n\nrequirements for validation are described in section 4.10. and annex 2, the requirements for user management are described in annex 3, while the requirements for information technology (it) security are detailed in annex 4 of this guideline.\n\n### 5.1. description of systems\n\nthe responsible party should maintain a list of physical and logical locations of the data e.g. 
servers, functionality and operational responsibility for computerised systems and databases used in a clinical trial together with an assessment of their fitness for purpose.\n\nwhere multiple computerised systems/databases are used, a clear overview should be available so the extent of computerisation can be understood. system interfaces should be described, defining how the systems interact, including validation status, methods used, and security measures implemented.\n\n### 5.2. documented procedures\n\ndocumented procedures should be in place to ensure that computerised systems are used correctly. these procedures should be controlled and maintained by the responsible party.\n\n### 5.3. training\n\neach individual involved in conducting a clinical trial should be qualified by education, training, and experience to perform their respective task(s). this also applies to training on computerised systems. systems and training should be designed to meet the specific needs of the system users (e.g. sponsor, investigator or service provider). special consideration should be given to the training of trial participants when they are users.\n\nthere should be training on the relevant aspects of the legislation and guidelines for those involved in developing, coding, building, and managing trial specific computerised systems, for example, those employed at a service provider supplying ecrf, irt, epro, trial specific configuration, customisation, and management of the system during the conduct of the clinical trial. all training should be documented, and the records retained and available for monitoring, auditing, and inspections.\n\n### 5.4. 
security and access control\n\nto maintain data integrity and the protection of the rights of trial participants, computerised systems used in clinical trials should have security processes and features to prevent unauthorised access and unwarranted data changes and should maintain blinding of the treatment allocation where applicable. checks should be used to ensure that only authorised individuals have access to the system and that they are granted appropriate permissions (e.g. ability to enter or make changes to data). records of authorisation of access to the systems, with the respective levels of access clearly documented, should be maintained. the system should record changes to user roles and thereby access rights and permissions.\n\nthere should be documented training on the importance of security e.g. the need to protect passwords and to keep them confidential, enforcement of security systems and processes, identification and handling of security incidents, social engineering and the prevention of phishing.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "b706000e-4708-4f0d-8619-f1e940fde5af": {"__data__": {"id_": "b706000e-4708-4f0d-8619-f1e940fde5af", "embedding": null, "metadata": {"page_label": "18", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Electronic Data Management and Security in Clinical Trials: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. 
What specific guidelines does the EMA document provide regarding the management of timestamps in electronic data collection systems used in clinical trials, particularly concerning the modification of date, time, and time zone settings by users?\n\n2. How does the document address the process of data transcription from paper sources into electronic data collection (EDC) systems or databases in clinical trials, including the measures to ensure the quality of transcribed data?\n\n3. What are the validation requirements and security measures outlined in the document for the transfer of trial data between systems, especially in the context of protecting data integrity and confidentiality during transfers over open networks?", "prev_section_summary": "The section discusses the requirements for validation, user management, and IT security in computerised systems used in clinical trials. It emphasizes the need for maintaining a list of physical and logical locations of data, documenting procedures for correct system usage, providing training for individuals involved in the trial, and implementing security measures to protect data integrity and trial participant rights. The section also highlights the importance of system interfaces, training documentation, and access control measures to prevent unauthorized access and data changes.", "excerpt_keywords": "Electronic Data Management, Clinical Trials, EMA Guideline, Data Integrity, Timestamp Management"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\n## see annexes 3 and 4 for further guidance on user management and it security.\n\n### 5.5. 
timestamp\n\naccurate and unambiguous date and time information given in coordinated universal time (utc) or time and time zone (set by an external standard) should be automatically captured.\n\nusers should not be able to modify the date, time and time zone on the device used for data entry, when this information is captured by the computerised system and used as a timestamp.\n\n### 6. electronic data\n\nfor each trial, it should be identified what electronic data and records will be collected, modified, imported and exported, archived and how they will be retrieved and transmitted. electronic source data, including the audit trail, should be directly accessible by investigators, monitors, auditors, and inspectors without compromising the confidentiality of participants' identities.\n\n#### 6.1. data capture and location\n\nthe primary goal of data capture is to collect all data required by the protocol. all pertinent observations should be documented in a timely manner. the location of all source data should be specified prior to the start of the trial and updated during the conduct of the trial where applicable.\n\n##### 6.1.1. transcription\n\nsource data collected on paper (e.g. worksheets, paper crfs or paper diaries or questionnaires) need to be transcribed either manually or by a validated entry tool into the electronic data collection (edc) system or database(s). in case of manual transcription, risk-based methods should be implemented to ensure the quality of the transcribed data (e.g. double data entry and/or data monitoring).\n\n##### 6.1.2. transfer\n\ntrial data are transferred in and between systems on a regular basis. the process for file and data transfer needs to be validated and should ensure that data and file integrity are assured for all transfers. 
data that is collected from external sources and transferred in open networks should be protected from unwarranted changes and secured/encrypted in a way that precludes disclosure of confidential information.\n\nall transfers that are needed during the conduct of a clinical trial need to be pre-specified. validation of transfer should include appropriate challenging test sets and ensure that the process is available and functioning at clinical trial start (e.g. to enable ongoing sponsor review of diary data, lab data or adverse events by safety committees). data transcribed or extracted and transferred from electronic sources and their associated audit trails should be continuously accessible (according to delegated roles and corresponding access rights).\n\ntransfer of source data and records when the original data or file are not maintained is a critical process and appropriate considerations are expected in order to prevent loss of data and metadata.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "475397c1-4c08-4bd1-b910-98352092f09d": {"__data__": {"id_": "475397c1-4c08-4bd1-b910-98352092f09d", "embedding": null, "metadata": {"page_label": "19", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Data Capture, Edit Checks, and Audit Trails in Clinical Trials: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. 
What specific types of electronic data input devices and applications are mentioned as being used for direct data capture in clinical trials, according to the EMA Guideline document?\n \n2. How does the EMA Guideline document suggest handling the situation when edit checks on data inputs are paused during a clinical trial?\n\n3. What are the detailed requirements for an audit trail in computerised systems as outlined in the EMA Guideline document, particularly regarding its security, visibility, and the information it must contain?", "prev_section_summary": "The section discusses guidelines for electronic data management and security in clinical trials, focusing on timestamps, electronic data capture, data transcription, and data transfer between systems. Key topics include the importance of accurate timestamp information, the process of capturing and documenting electronic data, the transcription of paper-based data into electronic systems, and the validation and security measures required for data transfers to protect integrity and confidentiality. Entities mentioned include users, investigators, monitors, auditors, inspectors, source data, electronic data collection systems, and data transfer processes.", "excerpt_keywords": "Clinical Trials, Electronic Data Capture, Audit Trails, Edit Checks, Data Integrity"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\n## 6.1.3. direct capture\n\ndirect data capture can be done by using electronic data input devices and applications such as electronic diaries, electronic questionnaires and ecrfs for direct data entry. 
where treatment-related pertinent information is captured first in a direct data capture tool such as a trial participant diary, a pro form or a special questionnaire, a documented procedure should exist to transfer or transcribe information into the medical record, when relevant.\n\ndirect data capture can also be done by automated devices such as wearables or laboratory or other technical equipment (e.g. medical imaging, electrocardiography equipment) that are directly linked to a data acquisition tool. such data should be accompanied by metadata concerning the device used (e.g. device version, device identifiers, firmware version, last calibration, data originator, timestamp of events).\n\n## 6.1.4. edit checks\n\ncomputerised systems should validate manual and automatic data inputs to ensure a predefined set of validation criteria is adhered to. edit checks should be relevant to the protocol and developed and revised as needed. edit checks should be validated and implementation of the individual edit checks should be controlled and documented. if edit checks are paused at any time during the trial, this should be documented and justified. edit checks could either be run immediately at data entry or automatically during defined intervals (e.g. daily) or manually.\n\nsuch approaches should be guided by necessity, should not cause bias and should be traceable e.g. when data are changed as a result of an edit check notification.\n\nthe sponsor should not make automatic or manual changes to data entered by the investigator or trial participants unless authorised by the investigator.\n\n## 6.2. audit trail and audit trail review\n\n### 6.2.1. audit trail\n\nan audit trail should be enabled for the original creation and subsequent modification of all electronic data. in computerised systems, the audit trail should be secure, computer generated and timestamped. an audit trail is essential to ensure that changes to the data are traceable. 
audit trails should be robust, and it should not be possible for normal users to deactivate them. if it is possible for an audit trail to be deactivated by admin users, this should automatically create an entry into a log file (e.g. audit trail).\n\nentries in the audit trail should be protected against change, deletion, and access modification (e.g. edit rights, visibility rights). the audit trail should be stored within the system itself. the responsible investigator, sponsor, and inspector should be able to review and comprehend the audit trail and therefore audit trails should be in a human-readable format.\n\naudit trails should be visible at data-point level in the live system, and it should be possible to export the entire audit trail as a dynamic data file to allow for the identification of systematic patterns or concerns in data across trial participants, sites, etc. the audit trail should show the initial entry and the changes (value - previous and current) specifying what was changed (field, data identifiers) by whom (username, role, organisation), when (date/timestamp) and, where applicable, why (reason for change).\n\na procedure should be in place to address the situation when a data originator (e.g. 
investigator or trial participant) realises that she/he has submitted incorrect data by mistake and wants to correct the recorded data.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "2888e4b8-38b8-4093-ac3f-d8ea476d379f": {"__data__": {"id_": "2888e4b8-38b8-4093-ac3f-d8ea476d379f", "embedding": null, "metadata": {"page_label": "20", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Enhancing Data Integrity and Traceability: A Guide to Audit Trails and Metadata Review\"", "questions_this_excerpt_can_answer": "1. What specific types of electronic systems are mentioned as potentially not uploading data immediately and how should changes to data stored in local memory be handled before saving, according to the EMA Guideline on computerised systems and electronic data in clinical trials?\n\n2. How does the guideline suggest handling audit trails for data extracts or database extracts used for internal reporting and statistical analysis, and what does it say about capturing the generation of these extracts?\n\n3. 
What are the recommended procedures for audit trail review in clinical trials as per the guideline, and what specific aspects should this review focus on to ensure data integrity and traceability?", "prev_section_summary": "This section discusses direct data capture in clinical trials using electronic devices and applications, the validation of data inputs through edit checks, and the requirements for an audit trail in computerized systems. Key topics include the types of electronic data input devices used, handling of paused edit checks, and the security and visibility of audit trails. Entities mentioned include electronic diaries, questionnaires, eCRFs, wearables, metadata, validation criteria, audit trail entries, and data originators.", "excerpt_keywords": "EMA, Guideline, computerised systems, electronic data, clinical trials"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\nit is important that original electronic entries are visible or accessible (e.g. in the audit trail) to ensure the changes are traceable. the audit trail should record all changes made as a result of data queries or a clarification process. the clarification process for data entered should be described and documented. changes to data should only be performed when justified. justification should be documented. in case the data originator is the trial participant, special considerations to data clarifications might be warranted. see annex 5 section a5.1.1.4 for further details.\n\nfor certain types of systems (e.g. epro) the data entered may not be uploaded immediately but may be temporarily stored in local memory. such data should not be edited or changed without the knowledge of the data originator prior to saving. 
any changes or edits should be acknowledged by the data originator, should be documented in an audit trail and should be part of validation procedures. the timestamp of data entry in the capture tool (e.g. ecrf) and timestamp of data saved to a hard drive should be recorded as part of the metadata. the duration between initial capture in local memory and upload to a central server should be short and traceable (i.e. transaction time), especially in case of direct data entry.\n\ndata extracts or database extracts for internal reporting and statistical analysis do not necessarily need to contain the audit trail information. however, the database audit trail should capture the generation of data extracts and exports.\n\naudit trails should capture any changes in data entry per field and not per page (e.g. ecrf page). in addition to the audit trail, metadata could also include (among others) review of access logs, event logs, queries etc.\n\naccess logs, including username and user role, are in some cases considered to be important metadata and should consequently be available. this is considered necessary e.g. for systems that contain critical unblinded data. care should be taken to ensure that information jeopardising the blinding does not appear in the audit trail accessible to blinded users.\n\naudit trail review\n\nprocedures for risk-based trial specific audit trail reviews should be in place and performance of data review should be generally documented. data review should focus on critical data. data review should be proactive and ongoing review is expected unless justified. manual review as well as review by the use of technologies to facilitate the review of larger datasets should be considered. 
data review can be used to (among others) identify missing data, detect signs of data manipulation, identify abnormal data/outliers and data entered at unexpected or inconsistent hours and dates (individual data points, trial participants, sites), identify incorrect processing of data (e.g. non-automatic calculations), detect unauthorised accesses, detect device or system malfunction and to detect if additional training is needed for trial participants /site staff etc. audit trail review can also be used to detect situations where direct data capture has been defined in the protocol but where this is not taking place as described.\n\nin addition to audit trail review, metadata review could also include (among others) review of access logs, event logs, queries, etc.\n\nthe investigator should receive an introduction on how to navigate the audit trail of their own data in order to be able to review changes.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "1ad4381f-43ef-4452-88e7-f4af066283d7": {"__data__": {"id_": "1ad4381f-43ef-4452-88e7-f4af066283d7", "embedding": null, "metadata": {"page_label": "21", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Comprehensive Guide to Data Management and Oversight in Clinical Trials", "questions_this_excerpt_can_answer": "1. 
What are the responsibilities of investigators regarding data entered into electronic case report forms (eCRFs) and other data acquisition tools in clinical trials, as outlined in the EMA Guideline on computerised systems and electronic data?\n \n2. How does the EMA Guideline suggest sponsors should determine the timing and frequency of investigator sign-off on data in clinical trials, and what factors should be considered in a risk-based approach to this process?\n\n3. What are the guidelines for copying or transcribing data in clinical trials, according to the EMA Guideline on computerised systems and electronic data, and under what conditions should a copy be certified?", "prev_section_summary": "The section discusses the importance of maintaining data integrity and traceability in electronic systems used in clinical trials. It covers topics such as handling changes to data, audit trails for data extracts, procedures for audit trail review, and the role of metadata in ensuring data quality. Key entities mentioned include original electronic entries, audit trails, data originators, data extracts, metadata, access logs, event logs, and the investigator. The section emphasizes the need for documenting changes, justifying data edits, and conducting proactive data reviews to identify issues such as missing data, data manipulation, and unauthorized accesses.", "excerpt_keywords": "Clinical trials, Data management, Electronic data, Investigator responsibilities, EMA Guideline"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\n### 6.3. sign-off of data\n\nthe investigators are responsible for data entered into ecrfs and other data acquisition tools under their supervision (electronic records).\n\nthe sponsor should seek investigator endorsement of their data at predetermined milestones. 
the signature of the investigator or authorised member of the investigator's staff is considered as the documented confirmation that the data entered by the investigator and submitted to the sponsor are attributable, legible, original, accurate, complete and contemporaneous. any member of the staff authorised for sign-off should be qualified to do so in order to fulfil the purpose of the review as described below. national law could require specific responsibilities, which should then be followed.\n\nthe acceptable timing and frequency for the sign-off needs to be defined and justified for each trial by the sponsor and should be determined by the sponsor in a risk-based manner. the sponsor should consider trial-specific risks and provide a rationale for the risk-based approach. points of consideration are types of data entered, non-routine data, importance of data, data for analysis, length of the trial and the decision made by the sponsor based on the entered data, including the timing of such decisions. it is essential that data are confirmed prior to interim analysis and the final analysis, and that important data related to e.g. reporting of serious adverse events (saes), adjudication of important events and endpoint data, data and safety monitoring board (dsmb) review, are signed off in a timely manner. in addition, a timely review and sign-off of data that are entered directly into the ecrf as source is particularly important.\n\ntherefore, it will rarely be sufficient to just provide one signature immediately prior to database lock. signing of batches of workbooks is also not suited to ensure high data quality and undermines the purpose of timely and thorough data review.\n\nfor planned interim analysis, e.g. when filing for a marketing authorisation application, all submitted data need to be signed off by the investigator or their designated and qualified representative before extracting data for analysis. 
the systems should be designed to support this functionality.\n\nto facilitate timely data review and signing by the investigator or their designated representative, the design of the data acquisition tool should be laid out to support the signing of the data at the defined time points.\n\nfurthermore, it is important that the investigator review the data on an ongoing basis in order to detect shortcomings and deficiencies in the trial conduct at an early stage, which is the precondition to undertake appropriate corrective and preventive actions.\n\nadequate oversight by the investigator is a general requirement to ensure participant safety as well as data quality and integrity. oversight can be demonstrated by various means, one of them being the review of reported data. lack of investigator oversight may prevent incorrect data from being corrected in a timely manner and necessary corrective and preventive actions being implemented at the investigator site.\n\n### 6.4. copying data\n\ndata can be copied or transcribed for different purposes, either to replace source documents or essential documents or to be distributed amongst different stakeholders as working copies. if essential documents or source documents are irreversibly replaced by a copy, the copy should be certified (see section 6.5.).\n\ncopies should contain a faithful representation of the data and the contextual information. source documents and data should allow accurate copies to be made. the method of copying should be practical and should ensure that the resulting copy is complete and accurate. 
it should include the relevant", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "7322738e-0bef-4377-9dce-6b1e3955ad1f": {"__data__": {"id_": "7322738e-0bef-4377-9dce-6b1e3955ad1f", "embedding": null, "metadata": {"page_label": "22", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Certified Copies and Data Control in Clinical Trials: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. What specific considerations should be taken into account when creating a certified copy of dynamic files or files resulting from an interpreter, according to the EMA Guideline on computerised systems and electronic data in clinical trials?\n\n2. How does the guideline address the control of data generated at clinical trial sites, especially regarding the investigator's access to trial participant data and the sponsor's control over the data?\n\n3. What alternatives does the guideline suggest if a clinical trial's data storage does not allow the investigator to hold an independent copy of the data, and how can these alternatives ensure data verifiability and security?", "prev_section_summary": "This section discusses the responsibilities of investigators regarding data entered into electronic case report forms (eCRFs) and other data acquisition tools in clinical trials, as outlined in the EMA Guideline on computerised systems and electronic data. 
It emphasizes the importance of investigator sign-off on data at predetermined milestones, the timing and frequency of sign-off determined by sponsors in a risk-based manner, and the guidelines for copying or transcribing data in clinical trials. Key topics include the responsibilities of investigators, the importance of timely data review and sign-off, the need for ongoing oversight by investigators, and the guidelines for copying data in clinical trials. Key entities mentioned include investigators, sponsors, data acquisition tools, source documents, and essential documents.", "excerpt_keywords": "EMA Guideline, computerised systems, electronic data, clinical trials, certified copies"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\n## metadata and such metadata should be complete and accurate. see also section 5 of the guideline on the content, management and archiving of the clinical trial master file (paper and/or electronic) (ema/ins/gcp/856758/2018), for further details on definition.\n\n6.5. certified copies\n\nwhen creating a certified copy, the nature of the original document needs to be considered. for example, the content of the file is either static (e.g. a pdf document) or dynamic (e.g. a worksheet with automatic calculations) or the copy tries to capture the result of an interpreter (e.g. a web page, where a web-browser interprets written hypertext mark-up language (html), javascript (js) among other programming languages). either way, the result of the copy process should be verified either automatically by a validated process or manually to ensure that the same information is present -- including data that describe the context, content, and structure -- as in the original.\n\nin case of dynamic files e.g. 
when a database is decommissioned and copies of data and metadata are provided to sponsors, the resulting file should also capture the dynamic aspects of the original file. in case of files, which are the result of an interpreter, special care needs to be taken to not only consider the informative content of such a file, but also to capture and preserve aspects that are the result of the interactions of the used interpreter(s) and system settings during the display. for example, window size, browser type, operating system employed and the availability of software dependencies (e.g. enabled active web content) can influence the structure and content displayed. special considerations should be taken whenever copies are to replace original source documents.\n\n6.6. control of data\n\ndata generated at the clinical trial site relating to the trial participants should be available to the investigator at all times during and after the trial to enable investigators to make decisions related to eligibility, treatment, care for the participants, etc. and to ensure that the investigator can fulfil their legal responsibility to retain an independent copy of the data for the required retention period. this includes data from external sources, such as central laboratory data, centrally read imaging data and epro data.\n\nexceptions should be justified in the protocol e.g. if sharing this information with the investigator would jeopardise the blinding of the trial.\n\nthe sponsor should not have exclusive control of the data entered in a computerised system at any point in time. 
all data held by the sponsor that has been generated in a clinical trial should be verifiable to a copy of these data that is not held (or that has not been held) by the sponsor.\n\nthe requirements above are not met if data are captured in a computerised system and the data are stored on a central server under the sole control of the sponsor or under the control of a service provider that is not considered to be independent from the sponsor or if the sponsor (instead of the service provider) is distributing the data to the investigator. this is because the investigator does not hold an independent copy of the data and therefore the sponsor has exclusive control of the data. in order to meet the requirements, the investigator should be able to download a contemporaneous certified copy of the data. this is in addition to the record maintained at a service provider.\n\ninstead of a system maintained by an independent service provider, the sponsor may take other adequate technical measures that preclude sole control. e.g. the verifiability of data (transactions) by an independent (distributed) tamper-proof ledger may provide comparable security to a system maintained by an independent service provider. 
this should be justified and documented.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "9d28b341-1fd6-446e-a849-dd0a54593ced": {"__data__": {"id_": "9d28b341-1fd6-446e-a849-dd0a54593ced", "embedding": null, "metadata": {"page_label": "23", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Ensuring Investigator Control and Data Security in Cloud Solutions for Clinical Trials: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. How can investigators ensure continued access to data entered into data acquisition tools throughout the legally mandated duration and local legal requirements, especially in the context of database lock processes?\n \n2. What specific considerations should be made when using cloud solutions for hosting clinical trial data, particularly regarding the qualification of the service provider and the management of data jurisdiction across different regions?\n\n3. In the scenario of investigator-initiated trials, what measures should be taken to maintain a degree of independence between the data hosting entity and the operational aspects of the trial, and how should this be documented in agreements?", "prev_section_summary": "This section discusses the creation of certified copies in clinical trials, considering the nature of the original document (static or dynamic). 
It also addresses the control of data generated at clinical trial sites, emphasizing the investigator's access to participant data and the sponsor's control over the data. The section highlights the importance of ensuring data verifiability and security, especially when the investigator does not hold an independent copy of the data. It suggests alternatives such as using independent service providers or implementing technical measures like tamper-proof ledgers to meet data control requirements.", "excerpt_keywords": "clinical trials, investigator control, data security, cloud solutions, data jurisdiction"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\ndata entered into data acquisition tools by the investigator should be available to the investigator throughout the whole legally mandated duration and for the full duration of local legal requirements. this can be ensured either by contemporaneous local copies at the trial site or by the use of a service provider. access to the data may be amended to read-only as part of the database lock process. prior to read-only access to the investigator being revoked, a copy including the audit trail should be made available to the investigator in a complete and comprehensive way. in the situation where a service provider is hosting the data, the copy should not be provided via the sponsor, as this would temporarily provide the sponsor with exclusive control over the data and thereby jeopardise the investigator's control. copies should not be provided in a way that requires advanced technical skills from the investigators. 
the period between the provision of the copy to the investigator and the closure of the investigator's read-only access to the database(s) should allow sufficient time for the investigator to review the copy, and access should not be revoked until such a review has been performed.\n\nany contractual agreements regarding hosting should ensure investigator control. if the sponsor is arranging hosting on behalf of the investigators through a service provider, agreements should ensure the level of investigator control mentioned above. investigators delegating hosting of such data to service providers themselves should ensure that the intended use is covered by local legal requirements and the in-house rules of the institution.\n\nfor investigator-initiated trials, where the data are hosted somewhere in the sponsor/institution organisation, the degree of independence should be justified and pre-specified in agreements, e.g., that it is a central it department, not otherwise involved in the operational aspects of the trial, hosting the data and providing copies to the participating investigators.\n\ncloud solutions\n\nirrespective of whether a computerised system is installed at the premises of the sponsor, investigator, another party involved in the trial, or whether it is made available by a service provider as a cloud solution, the requirements in this guideline are applicable. there are, however, specific points to be considered as described below.\n\ncloud solutions cover a wide variety of services related to the computerised systems used in clinical trials. these can range from infrastructure as a service (iaas) through platform as a service (paas) to software as a service (saas). 
it is common for these services that they provide the responsible party on-demand availability of computerized system resources over the internet, without having the need or even the possibility to directly manage these services.\n\nif a cloud solution is used, the responsible party should ensure that the service provider providing the cloud is qualified. when using cloud computing, the responsible parties are at a certain risk because many services are managed less visibly by the cloud provider. contractual obligations with the cloud solution provider should be detailed and explicit and refer to all ich e6 relevant topics and to all relevant legal requirements (see annex 1).\n\ndata jurisdiction may be complex given the nature of cloud solutions and services being shared over several sites, countries, and continents; however, any uncertainties should be addressed and solved by contractual obligations prior to the use of a cloud solution. if the responsible party chooses to perform their own validation of the computerized system, the cloud provider should make a test environment available that is identical to the production environment.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "c4236f4b-aa15-4540-9680-b40477a0a139": {"__data__": {"id_": "c4236f4b-aa15-4540-9680-b40477a0a139", "embedding": null, "metadata": {"page_label": "24", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Data Management and Security Best 
Practices: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. What specific recommendations does the EMA Guideline on computerised systems and electronic data in clinical trials provide for ensuring the security and integrity of data backups in clinical trials?\n \n2. How does the guideline suggest handling the migration of data, including individual safety reports, from one system to another to ensure that the integrity and quality of the data and metadata are not compromised during the process?\n\n3. What procedures and tests does the guideline recommend for validating the data migration process, and how should the complexity of the task and the verification of migrated data be approached according to the document?", "prev_section_summary": "This section discusses the importance of ensuring investigator control and data security in cloud solutions for clinical trials. Key topics include the availability of data to investigators throughout the duration of legal requirements, considerations for using cloud solutions for hosting clinical trial data, and maintaining independence between the data hosting entity and operational aspects of the trial. Entities mentioned include investigators, service providers, sponsors, and cloud solution providers. The section emphasizes the need for clear contractual agreements to ensure investigator control and compliance with legal requirements when using cloud solutions.", "excerpt_keywords": "EMA, Guideline, computerised systems, electronic data, clinical trials"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\n## 6.8. backup of data\n\ndata stored in a computerised system are susceptible to system malfunction, intended or unintended attempts to alter or destroy data and physical destruction of media and infrastructure and are therefore at risk of loss. 
data and configurations should be regularly backed up. please also refer to annex 4 for further details on it security.\n\nthe use of replicated servers is strongly recommended. backups should be stored in separate physical locations and logical networks and not behind the same firewall as the original data to avoid simultaneous destruction or alteration.\n\nfrequency of backups (e.g. hourly, daily, weekly) and their retention (e.g. a day, a week, a month) should be determined through a risk-based approach.\n\nchecks of accessibility to data, irrespective of format, including relevant metadata, should be undertaken to confirm that the data are enduring, continue to be available, readable and understandable by a human being. there should be procedures in place for risk-based (e.g. in connection with major updates) restore tests from the backup of the complete database(s) and configurations and the performed restore tests should be documented.\n\ndisaster mitigation and recovery plans should be in place to deal with events that endanger data security. such plans should be regularly reviewed. disaster mitigation and recovery plans should be part of the contractual agreement, if applicable.\n\n## 6.9. contingency plans\n\nagreements and procedures should be in place to allow trial continuation and prevent loss of data critical to participant safety and trial results.\n\n## 6.10. migration of data\n\nmigration as opposed to the transfer of data (as described in section 6.1.2.) is the process of permanently moving existing data (including metadata) from one system into another system e.g. the migration of individual safety reports from one safety database to another. 
it should be ensured that the migration does not adversely affect existing data and metadata.\n\nin the course of the design or purchase of a new system and of subsequent data migration from an old system, validation of the data migration process should have no less focus than the validation of the system itself.\n\nthe validation of data migration should take into consideration the complexity of the task and any foreseen possibilities that may exist to verify the migrated data (e.g. checksum, case counts, quality control of records).\n\nprior to migration, the process should be planned in detail. a risk analysis identifying the most probable risks should take place and should yield appropriate mitigation strategies. after the planning, the intended procedure should be validated with mock data and results should be considered for risk-assessment and mitigation. a data verification focused on key data should be performed post migration.\n\nverification of migrated data can be simple or complex, depending on the different platforms and systems involved. regardless of the effort needed, the migration process should be documented in such detail that throughout all data operations/transformations data changes remain traceable. 
mapping from the old system onto the new system should be retained.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "baaaef3c-1c79-4e88-a1d4-178c9e53f773": {"__data__": {"id_": "baaaef3c-1c79-4e88-a1d4-178c9e53f773", "embedding": null, "metadata": {"page_label": "25", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Data Management and Retention in Clinical Trials: Best Practices and Guidelines", "questions_this_excerpt_can_answer": "1. What specific actions are recommended if the migration of data into a new system results in the loss of relevant data or the separation of the audit trail from the data in clinical trials?\n \n2. How should clinical trial investigators and sponsors manage the archiving of trial data and essential documents to comply with data protection principles and regulatory requirements on data retention periods?\n\n3. What considerations should be taken into account when deciding to decommission a database after the conclusion of a clinical trial, especially in relation to future marketing authorization applications?", "prev_section_summary": "The section discusses the importance of data backup, contingency plans, and data migration in clinical trials to ensure data security and integrity. Key topics include the frequency and retention of backups, use of replicated servers, disaster mitigation and recovery plans, and validation of the data migration process. 
Entities mentioned include data, metadata, systems, migration process, risk analysis, validation, and verification of migrated data. The section emphasizes the need for detailed planning, risk assessment, and documentation throughout the data operations to maintain data traceability.", "excerpt_keywords": "data migration, audit trail, archiving, retention periods, database decommissioning"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\n## data migration and audit trail\n\ndata, contextual information, and the audit trail should not be separated. in case migration of data into a new system results in a loss of relevant data, adequate mitigating actions should be taken to establish a robust method to join the audit trail and the data for continuous access by all stakeholders. a detailed explanation is expected if no such method has been established to allow the migration of data and the audit trail. arrangements should ensure that the link between data and metadata can be established. if several parties are involved, agreements should be in place to ensure this.\n\n## archiving\n\nthe investigator and sponsor should be aware of the required retention periods for clinical trial data and essential documents, including metadata. retention periods should respect the data protection principle of storage limitation. an inventory of all essential data and documents and corresponding retention periods should be maintained. it should be clearly defined which data are related to each clinical trial activity and where this record is located and who has access/edit rights to the document. security controls should be in place to ensure data confidentiality, integrity, and availability. it should be ensured that the file and any software required (depending on the media used for storage) remain accessible throughout the retention period. 
suitable archiving systems should be in place to safeguard data integrity for the periods established by the regulatory requirements.\n\n## database decommissioning\n\nafter the finalization of the trial, database(s) might be decommissioned. it is recommended that the time of decommissioning is decided taking into consideration whether the clinical trial will be used for a marketing authorization application in the near future, in which case it is recommended to keep the database(s) live. a dated and certified copy of the database(s) and data should be archived and available on request. in case of decommissioning, the sponsor should ensure that archived formats provide the possibility to restore the database(s) along with all relevant metadata. the sponsor should review the system to determine the audit trails and logs available in the system and how these would be retained as dynamic files.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "8f898e53-daaf-41b6-a319-7adf41fb7b8e": {"__data__": {"id_": "8f898e53-daaf-41b6-a319-7adf41fb7b8e", "embedding": null, "metadata": {"page_label": "26", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Managing Data Retention and Migration for Sponsorship Compliance", "questions_this_excerpt_can_answer": "1. What is the minimum duration for data retention by sponsors as mandated by Regulation (EU) No 536/2014 in the context of clinical trials?\n \n2. 
How does the document describe the process and requirements for managing electronic data during the live phase, including report generation, archiving, and destruction, in clinical trials?\n\n3. What guidance does the document provide regarding software and media migrations in the context of maintaining compliance with data integrity and retention requirements in clinical trials?", "prev_section_summary": "The key topics of the section include data migration and audit trail, archiving of clinical trial data and essential documents, and database decommissioning. Entities mentioned in the section include stakeholders, investigators, sponsors, data, metadata, retention periods, security controls, archiving systems, database(s), and audit trails. The section emphasizes the importance of maintaining the link between data and metadata, complying with data protection principles, and ensuring data integrity throughout the retention period. It also highlights the considerations for decommissioning databases after the conclusion of a clinical trial, especially in relation to future marketing authorization applications.", "excerpt_keywords": "data retention, sponsor, regulation (EU) No 536/2014, electronic data, clinical trials"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\nfigure 2: data retention by sponsor - live phase, report and submission, archiving and destruction over time; locked system (dynamic and flat files); active system (allow recommission); retention of at least 25 years in regulation (eu) no 536/2014; software and media migrations.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "62488ccd-7bb3-4b5e-9b7f-66ceb3073cf9": {"__data__": {"id_": 
"62488ccd-7bb3-4b5e-9b7f-66ceb3073cf9", "embedding": null, "metadata": {"page_label": "27", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Clinical Trial Management and Computerised Systems: Responsibilities and Agreements", "questions_this_excerpt_can_answer": "1. What responsibilities do sponsors and investigators have regarding the delegation of tasks related to computerised systems in clinical trials, according to the EMA guideline?\n \n2. How does the EMA guideline suggest handling agreements for the use of non-trial specific computerised systems in clinical trials, and what is required from the responsible party in such cases?\n\n3. According to the EMA guideline, what measures should be taken if a service provider is unwilling to support pre-qualification audits or regulatory inspections for their computerised systems used in clinical trials?", "prev_section_summary": "The section discusses the minimum duration for data retention by sponsors as mandated by Regulation (EU) No 536/2014 in the context of clinical trials, the process and requirements for managing electronic data during the live phase of clinical trials, including report generation, archiving, and destruction, and guidance on software and media migrations to maintain compliance with data integrity and retention requirements. Key topics include data retention, live phase management, report generation, archiving, destruction, software migrations, and media migrations. 
Key entities mentioned are Regulation (EU) No 536/2014, dynamic and flat files, locked system, active system, and software and media migrations.", "excerpt_keywords": "Clinical trials, Computerised systems, Responsibilities, Agreements, Data integrity"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\n## annex 1 agreements\n\nthe legally responsible parties are the sponsors and investigators. they contract/delegate an increasing number of tasks in clinical trials; contracting is frequent in the area of computerised systems where the responsible party might lack internal knowledge or resources or they wish to purchase a product or a service that has been developed by others. the responsible parties can delegate tasks to a service provider, but nevertheless the full responsibility for the data integrity, security and confidentiality resides with them.\n\nagreements can cover a variety of tasks such as system and trial specific configuration and customisation, provision of a license to an application, full clinical trial service including data management tasks e.g. site contact, training, data clarification processes, etc., but could also be restricted to hosting services. a risk-based approach can be used in relation to agreements as well as for computerised systems in general. it is recognised that a trial specific agreement is not required if a product is purchased and used as intended without the involvement of the manufacturer of the system; however, such use will require a risk assessment by the responsible party to assess whether such a non-trial specific system is fit for its intended use.\n\nthe responsible party should ensure that the distribution of tasks in a trial is clearly documented and agreed on. 
it should be ensured that each party has the control of and access to data and information that their legal responsibilities require and that the ethics committees and regulatory authorities approving trials have been properly informed of distribution of activities as part of the clinical trial application process, where applicable. this should be carefully documented in the protocol and related documents, procedures, agreements, and other documents as relevant. it is important to consider who is providing and controlling the computerised system being used.\n\nclear written agreements should be in place and appropriately signed by all involved parties prior to the provision of services or systems. agreements should be maintained/updated as appropriate. sub-contracting and conditions for sub-contracting and the responsible party's oversight of sub-contracted activities should be specified.\n\nthe responsible parties should ensure oversight of these trial-related duties e.g. by reviewing defined key performance indicators (kpis) or reconciliations.\n\nif appropriate agreements cannot be put in place due to the inability or reluctance of a service provider to allow access to important documentation (e.g. system requirements specifications) or the service provider is unwilling to support pre-qualification audits or regulatory inspections, systems from such a service provider should not be used in clinical trials.\n\nthe responsible party should ensure that service providers (including vendors of computerised systems) have the knowledge and the processes to ensure that they can perform their tasks in accordance with ich e6, as appropriate to their tasks. standards to be followed, e.g. clinical trial legislation and guidance should be specified in the agreement, where relevant. a number of tasks involve accessing, reviewing, collecting and/or analysing data, much of which is personal/pseudonymised data. 
in addition, in specific cases involving contact with (potential) trial participants, data protection legislation needs to be followed, in addition to the clinical trial legislation and guidance.\n\nthe approved protocol, implicitly, defines part of the specification for system configuration or customisation (e.g. for interactive response technologies (irt) systems and data acquisition tools) and there should be consistency between the protocol and the wording of the agreement. in addition, it should be clear how subsequent changes to the protocol are handled so that the vendor can implement changes to the computerised system, where relevant.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "39f878a5-cccc-449c-9f57-9546a2b34705": {"__data__": {"id_": "39f878a5-cccc-449c-9f57-9546a2b34705", "embedding": null, "metadata": {"page_label": "28", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Best Practices for Ensuring Compliance and Accountability in Clinical Trial Documentation and Data Management\"", "questions_this_excerpt_can_answer": "1. What specific types of documentation related to computerised systems and trial master file (TMF) should be retained throughout the full retention period in clinical trials, according to the EMA guideline?\n \n2. 
How does the EMA guideline address the issue of data and system documentation access for GCP inspectors from EU/EEA authorities, especially in terms of validation and operation of computerised systems used in clinical trials?\n\n3. What are the responsibilities of sponsors regarding the reporting of serious breaches, including data and security breaches, under Regulation (EU) No 536/2014 as outlined in the EMA guideline on computerised systems and electronic data in clinical trials?", "prev_section_summary": "The section discusses the responsibilities and agreements related to the use of computerised systems in clinical trials as outlined in the EMA guideline. Key topics include delegation of tasks, agreements between sponsors/investigators and service providers, data integrity, security, confidentiality, risk assessment, distribution of tasks, oversight of sub-contracted activities, importance of clear written agreements, compliance with regulations and standards, and handling of personal data. Key entities mentioned are sponsors, investigators, service providers, ethics committees, regulatory authorities, vendors of computerised systems, and trial participants.", "excerpt_keywords": "Clinical trials, EMA guideline, Data management, Compliance, Documentation"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\nit should be clear from agreements which tasks are delegated also in relation to retaining essential documentation for performed activities. in the context of clinical trials, system-documentation (including e.g. software/system validation documentation, vendor standard operating procedures (sops), training records, issues log/resolutions) as well as trial master file (tmf) documentation (e.g. emails on important decisions and meeting minutes) related to the individual clinical trial (including e.g. 
relevant helpdesk tickets or meeting minutes) should be retained for the full retention period. it should be clear from the agreement which party is retaining and maintaining which documentation and how and in what format that documentation is made available when needed e.g. for an audit or an inspection. there should be no difference in the availability of documentation irrespective of whether the documentation is held by the sponsor/investigator or a service provider or sub-contracted party.\n\nthe responsible party is ultimately responsible for e.g. the validation and operation of the computerised system and for providing adequate documented evidence of applicable processes.\n\nthe responsible party should be able to provide the gcp inspectors of the eu/eea authorities with access to the requested documentation regarding the validation and operation of computerised systems irrespective of who performed these activities.\n\nit should be specified in agreements that the sponsor or the institution, as applicable, should have the right to conduct audits at the vendor site and that the vendor site could be subject to inspections (by national and/or international authorities) and that the vendor site shall accept these. the responsible party should also ensure that their service providers act on/respond appropriately to findings from audits and inspections.\n\nthe sponsor has a legal responsibility under regulation (eu) no 536/2014 to report serious breaches, including important data and security breaches, to authorities within seven days. to avoid undue delay in sponsor reporting from the time of discovery e.g. 
by a vendor, agreements and related documents should specify which information should be escalated immediately to ensure regulatory compliance.\n\nas set out in ich e6, to ensure that the investigator, rather than the sponsor, maintains control over their data, it should be specified in agreements how investigators' access to and control over data are ensured during and after the trial, and the revocation of investigator access to data in case of decommissioning should be described. it should also be specified which outputs the involved parties (e.g. sponsor and investigators) will receive during and after the clinical trial and in what formats. types of output could include e.g. data collected via data acquisition tools including metadata, queries, history and status of changes to users and their access rights, and the description of format for delivery of the complete database to sponsors.\n\narrangements on the decommissioning of the database(s) should be clear, including the possibility to restore the database(s), for instance, for inspection purposes.\n\nthe agreements should address expectations regarding potential system down-time and the preparation of contingency plans.\n\ntasks transferred/delegated could include hosting of data. if data are hosted by a vendor, location of data storage and control (e.g. use of cloud services) should be described.\n\nagreements should ensure reliable, continued and timely access to the data in case of bankruptcy, shutdown, disaster of the vendor, discontinuation of service by the vendor or for reasons chosen by the sponsor/investigator (e.g. change of vendor).\n\nspecial consideration should be given to training and quality systems. vendors accepting tasks on computerised systems should not only be knowledgeable about computerised systems and data protection legislation, but also on gcp requirements, quality systems, etc. 
as appropriate to the tasks they perform.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "21c1c138-1807-4e6f-9f73-ae4166f1a5fd": {"__data__": {"id_": "21c1c138-1807-4e6f-9f73-ae4166f1a5fd", "embedding": null, "metadata": {"page_label": "29", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Guidelines for Sponsors on Computerised Systems in Clinical Trials: Ensuring Compliance and Data Integrity", "questions_this_excerpt_can_answer": "1. What specific document should be consulted in conjunction with the \"Guidelines for Sponsors on Computerised Systems in Clinical Trials: Ensuring Compliance and Data Integrity\" to ensure comprehensive understanding of the EMA's expectations regarding computerised systems in clinical trials?\n\n2. As of what date was the notice to sponsors regarding computerised systems, referenced in the \"Guidelines for Sponsors on Computerised Systems in Clinical Trials: Ensuring Compliance and Data Integrity,\" published on the EMA website?\n\n3. 
Can you identify a resource that provides additional context or requirements related to the use of computerised systems in clinical trials as outlined by the European Medicines Agency (EMA)?", "prev_section_summary": "The section discusses the importance of retaining essential documentation related to computerized systems and trial master files in clinical trials, specifying responsibilities for validation and operation of computerized systems, access to documentation for GCP inspectors, reporting of serious breaches, and ensuring investigator control over data. It also addresses agreements on audits, inspections, data access, system downtime, contingency plans, data hosting, training, and quality systems for vendors involved in clinical trials. Key entities mentioned include sponsors, investigators, vendors, and regulatory authorities.", "excerpt_keywords": "EMA, guidelines, sponsors, computerised systems, clinical trials"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\nthis guideline should be read together with the notice to sponsors regarding computerised systems (ema/ins/gcp/467532/2019) published on the ema website.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "93348fe5-9a2f-4a5e-9932-1bef47778c75": {"__data__": {"id_": "93348fe5-9a2f-4a5e-9932-1bef47778c75", "embedding": null, "metadata": {"page_label": "30", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 
1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Computerised Systems Validation in Clinical Trials: Responsibilities, Documentation, and Interfaces.", "questions_this_excerpt_can_answer": "1. What are the responsibilities of the responsible party regarding the validation of computerised systems used in clinical trials, as outlined in the EMA Guideline on computerised systems and electronic data?\n \n2. How should the responsible party approach using a vendor's validation documentation for computerised systems in clinical trials, according to the guidelines provided in the document?\n\n3. What steps should be taken if a service provider releases a new or updated version of a system at short notice, according to the guidelines on computerised systems validation in clinical trials?", "prev_section_summary": "The section discusses the guidelines for sponsors on computerised systems in clinical trials, emphasizing the importance of ensuring compliance and data integrity. It mentions the need to consult the notice to sponsors regarding computerised systems published on the EMA website for a comprehensive understanding of the EMA's expectations. 
The European Medicines Agency (EMA) is highlighted as the regulatory body overseeing the use of computerised systems in clinical trials.", "excerpt_keywords": "Computerised Systems Validation, Clinical Trials, EMA Guideline, Electronic Data, Responsibilities"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\n## annex 2 computerised systems validation\n\n## a2.1 general principles\n\nthe responsible party should ensure that systems used in clinical trials have been appropriately validated and demonstrated to meet the requirements defined in ich e6 and in this guideline.\n\nsystems should be validated independently of whether they are developed on request by the responsible party, are commercially or freely available, or are provided as a service.\n\nthe responsible party may rely on validation documentation provided by the vendor of a system if they have assessed the validation activities performed by the vendor and the associated documentation as adequate; however, they may also have to perform additional validation activities based on a documented assessment. in any case, the responsible party remains ultimately responsible for the validation of the computerised systems used in clinical trials.\n\nif the responsible party wants to use the vendor's validation documentation, the responsible party should ensure that it covers the responsible party's intended use as well as its defined needs and requirements.\n\nthe responsible party should be thoroughly familiar with the vendor's quality system and validation activities, which can usually be obtained through an in-depth systematic examination (e.g. an audit).\n\nthis examination should be performed by qualified staff with sufficient time spent on the activities and with cooperation from the vendor. 
it should go sufficiently deep into the actual activities, and a suitable number of relevant key requirements and corresponding test cases should be reviewed, and this review should be documented. the examination report should document that the vendor's validation process and documentation is satisfactory. any shortcomings should be mitigated by the responsible party, e.g. by requesting or performing additional validation activities.\n\nsome service providers may release new or updated versions of a system at short notice, leaving insufficient time for the responsible party to validate it or to review any validation documentation supplied by the service provider. in such a situation, it is particularly important for the responsible party to evaluate the vendor's process for validation prior to release for production, and to strengthen their own periodic review and change control processes. new functionalities should not be used by the responsible party until they have validated them or reviewed and assessed the vendor's documentation.\n\nif the responsible party relies on the vendor's validation documentation, inspectors should be given access to the full documentation and reporting of the responsible party's examination of the vendor. if this examination is documented in an audit report, this may require providing access to the report. the responsible party, or where applicable, the service provider performing the examination activities on their behalf, should have a detailed understanding of the validation documentation.\n\nas described in annex 1 on agreements, the validation documentation should be made available to the inspectors in a timely manner, irrespective of whether it is provided by the responsible party or the vendor of the system. 
contractual arrangements should be made to ensure continued access to this documentation for the legally defined retention period even if the sponsor discontinues the use of the system or if the vendor discontinues to support the system or ceases its activities.\n\nin case the vendor's validation activities and documentation are insufficient, or if the responsible party cannot rely on the vendor to provide documentation, the responsible party should validate the system. any difference between the test and the production configuration and environment should be documented and its significance assessed and justified.\n\ninterfaces between systems should be clearly defined and validated e.g. transfer of data from one system to another.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "2e8337b9-dd16-4a80-a467-f638c560d4fe": {"__data__": {"id_": "2e8337b9-dd16-4a80-a467-f638c560d4fe", "embedding": null, "metadata": {"page_label": "31", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Validation and Traceability in Clinical Trial Systems", "questions_this_excerpt_can_answer": "1. How does the EMA Guideline on computerised systems and electronic data in clinical trials define the scope and content of user requirements for clinical trial systems, and what specific types of requirements should be included to ensure compliance with ICH E6 and data integrity?\n\n2. 
What steps and considerations are outlined in the EMA Guideline for the process of system configuration and customisation specific to a clinical trial, including the handling of modifications due to protocol amendments?\n\n3. According to the EMA Guideline, how should traceability between user requirements and test cases or other relevant documents and activities be established and maintained to ensure compliance and integrity throughout a clinical trial system's lifecycle?", "prev_section_summary": "The section discusses the general principles of computerised systems validation in clinical trials as outlined in the EMA Guideline. Key topics include the responsibilities of the responsible party, reliance on vendor validation documentation, handling new system versions from service providers, thorough examination of vendors' quality systems, periodic review and change control processes, access to validation documentation for inspectors, and validation of system interfaces. The responsible party is ultimately responsible for ensuring that systems used in clinical trials are appropriately validated and meet defined requirements.", "excerpt_keywords": "EMA Guideline, computerised systems, electronic data, clinical trials, validation"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\n## a2.2 user requirements\n\ncritical system functionality implemented and used in a clinical trial should be described in a set of user requirements or use cases, e.g. in a user requirements specification (urs). this includes all functionalities, which ensure trial conduct in compliance with ich e6 and which include capturing, analysing, reporting and archiving clinical trial data in a manner that ensures data integrity. 
user requirements should include, but may not be limited to, operational, functional, data integrity, technical, interface, performance, availability, security, and regulatory requirements. the above applies independently of the sourcing strategy of the responsible party or the process used to develop the system.\n\nwhere relevant, user requirements should form the basis for system design, purchase, configuration, and customisation; but in any case, they should constitute the basis for system validation.\n\nthe responsible party should adopt and take full ownership of the user requirements, whether they are documented by the responsible party, by a vendor or by a service provider. the responsible party should review and approve the user requirements in order to verify that they describe the functionalities needed by users in their particular clinical trials.\n\nuser requirements should be maintained and updated as applicable throughout a system's lifecycle when system functionalities are changed.\n\n## a2.3 trial specific configuration and customisation\n\nthe configuration and customisation of a system for use in a specific trial should be pre-specified, documented in detail and verified as consistent with the protocol, with the data management plan and other related documents. trial specific configuration and customisation should be quality controlled and tested as applicable before release for production. it is recommended to involve users in the testing activities. the same process applies to modifications required by protocol amendments.\n\nif modifications to a system are introduced due to a protocol amendment, e.g. 
to collect additional information, it should be determined whether they should be applied to all trial participants or only to those concerned by the amendment.\n\nif new functionalities or interfaces need to be developed, or new code added, they should be validated before use.\n\n## a2.4 traceability of requirements\n\ntraceability should be established and maintained between each user requirement and test cases or other documents or activities, such as standard operating procedures, as applicable. this traceability may have many forms and the process may be automated by software. it should be continuously updated as requirements are changed to ensure that where applicable, for every requirement, there is a corresponding test case or action, in line with the risk evaluation.\n\n## a2.5 validation and test plans\n\nvalidation activities should be planned, documented, and approved. the validation plan should include information on the validation methodology, the risk-based approach taken and if applicable, the division of tasks between the responsible party and a service provider. 
prior to testing, the risk assessment should define which requirements and tests are related to critical system functionality.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "78c8e36a-7a5d-4eff-8ff0-bce6a37eefa2": {"__data__": {"id_": "78c8e36a-7a5d-4eff-8ff0-bce6a37eefa2", "embedding": null, "metadata": {"page_label": "32", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Software Testing and Release Management Best Practices Guide", "questions_this_excerpt_can_answer": "1. What are the key elements that should be included in test cases for software testing in clinical trials according to the EMA Guideline on computerised systems and electronic data?\n \n2. How does the EMA Guideline suggest handling deviations encountered during system validation in clinical trial software testing, and what steps should be taken before the software is released for production?\n\n3. What specific documentation and procedures does the EMA Guideline recommend for the test execution phase in the context of software testing and release management in clinical trials?", "prev_section_summary": "This section discusses the importance of user requirements, system configuration and customisation, traceability of requirements, and validation and test plans in clinical trial systems. 
Key topics include the description of critical system functionality in user requirements, the process of system configuration and customisation specific to a trial, the establishment and maintenance of traceability between requirements and test cases, and the planning and documentation of validation activities. The responsible party is emphasized to take ownership of user requirements and ensure compliance with ICH E6 and data integrity throughout the system's lifecycle.", "excerpt_keywords": "EMA Guideline, computerised systems, electronic data, clinical trials, software testing"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\ntest cases should be pre-approved. they may have many formats and while historically consisting of textual documents including tables with multiple columns corresponding to the elements below, they may also be designed and contained in dedicated test management systems, which may even allow automatic execution of test cases (e.g. regression testing). however, expectations to key elements are the same.\n\ntest cases should include:\n\n- the version of the software being tested;\n- any pre-requisites or conditions prior to conducting the test;\n- a description of the steps taken to test the functionality (input);\n- the expected result (acceptance criteria).\n\ntest cases should require the tester to document the actual result as seen in the test step, the evidence if relevant and, if applicable, the conclusion of the test step (pass/fail). where possible, the tester should not be the author of the test case. 
in case of test failure, the potential impact should be assessed and subsequent decisions regarding the deviations should be documented.\n\n## a2.6 test execution and reporting\n\ntest execution should follow approved protocols and test cases (see section a2.5), the version of the software being tested should be documented, and where applicable and required by test cases and test procedures, evidence (e.g. screen shots) should be captured to document test steps and results. where relevant, the access rights (role) and the identification of the person or automatic testing tool performing tests should be documented.\n\nwhere previously passed scripts are not retested along with the testing of fixes for previous failing tests, this should be risk assessed and the rationale should be documented.\n\ndeviations encountered during system validation should be recorded and brought to closure. any failure to meet requirements pre-defined to be critical should be solved or mitigating actions should be implemented prior to deployment. all open deviations and any known issues with the system at the time of release should be assessed and subsequent decisions should be documented in the validation report and, if applicable, in the release notes. the validation report should be approved by the responsible party before release for production.\n\n## a2.7 release for production\n\nthe responsible party should sign off the release prior to initial use. training materials, user guides and any other resources required for users should be available at the time of release.\n\n## a2.8 user helpdesk\n\nthere should be a mechanism to report, record, and solve defects and issues raised by the users e.g. via a helpdesk. 
defects and issues should be fixed in a timely manner.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "6acbcf9d-f76f-4c7d-8373-d88059f7c7c4": {"__data__": {"id_": "6acbcf9d-f76f-4c7d-8373-d88059f7c7c4", "embedding": null, "metadata": {"page_label": "33", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "System Validation and Change Control in Clinical Trials: Ensuring Compliance and Data Integrity", "questions_this_excerpt_can_answer": "1. What specific elements should be evaluated during the periodic review of a system's validation in clinical trials to ensure it remains in a validated state according to the EMA guideline?\n \n2. How does the EMA guideline recommend handling change control in computerised systems used in clinical trials, particularly in relation to documenting and authorizing changes, assessing their impact, and ensuring compliance with regulatory requirements?\n\n3. According to the EMA guideline, what steps should be taken to maintain data integrity and regulatory compliance when implementing changes to a computerised system used in clinical trials, including the management of documentation and validation activities?", "prev_section_summary": "The section discusses key elements that should be included in test cases for software testing in clinical trials according to the EMA Guideline on computerised systems and electronic data. 
It also covers how deviations encountered during system validation should be handled, steps to be taken before software is released for production, documentation and procedures recommended for the test execution phase, test case formats, test execution protocols, capturing evidence, risk assessment for retesting previously passed scripts, handling of deviations, release for production process, sign-off requirements, availability of training materials and user guides, and the importance of a user helpdesk for reporting and solving defects and issues raised by users.", "excerpt_keywords": "System Validation, Change Control, Clinical Trials, Data Integrity, EMA Guideline"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\n## a2.9 periodic review\n\nvalidation of a system should be maintained throughout the full system life cycle. periodic system reviews should be conducted to assess and document whether the system can still be considered to be in a validated state, or whether individual parts or the whole system needs re-validation. 
depending on the system type and application, the following elements (non-exhaustive list) should be evaluated and concluded, both individually and in combination:\n\n- changes to hardware/infrastructure;\n- changes to operating system/platform;\n- changes to the application;\n- changes to security procedures;\n- changes to backup and restore tools and procedures;\n- configurations or customizations;\n- deviations (or recurrence thereof);\n- performance incidents;\n- security incidents;\n- open and newly identified risks;\n- new regulation;\n- review of system accesses;\n- updates of agreements with the service provider.\n\nthese elements should be reviewed regardless of whether the system is hosted by the responsible party or by a service provider.\n\n## a2.10 change control\n\nthere should be a formal change control process. requests for change should be documented and authorized and should include details of the change, risk assessment (e.g. for data integrity, current functionalities and regulatory compliance), impact on the validated state and testing requirements. for trial-specific configurations and customizations, the change request should include the details of the protocol amendment if applicable.\n\nas part of the change control process, all documentation should be updated as appropriate (e.g. requirements, test scripts, training materials, user guide) and a report of the validation activities prepared and approved prior to release for production. the system should be version controlled.\n\nthe responsible party should ensure that any changes to the system do not result in data integrity or safety issues or interfere with the conduct of an ongoing trial. the investigator should be clearly informed of any change to a form (e.g. 
electronic case report form [ecrf] or electronic clinical outcome assessment [ecoa] page) and it should be clear when such changes were implemented.\n\nthe documentation relating to the validation of previous or discontinued system versions used in a clinical trial should be retained (see guideline on the content, management and archiving of the clinical trial master file (paper and/or electronic) [ema/ins/gcp/856758/2018], section 6.3).", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "9af98ce5-97de-4a99-a2cf-c75acc86eeee": {"__data__": {"id_": "9af98ce5-97de-4a99-a2cf-c75acc86eeee", "embedding": null, "metadata": {"page_label": "34", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Effective User Management in Clinical Trial Projects: Strategies for Success", "questions_this_excerpt_can_answer": "1. What specific measures does the EMA guideline recommend for managing user access in clinical trial computer systems to ensure compliance with data integrity and ICH E6 principles?\n \n2. How does the EMA guideline propose to handle the segregation of duties and privileged access within computerised systems used in clinical trials to prevent misuse and ensure the integrity of trial data?\n\n3. 
According to the EMA guideline, what is the stance on account sharing within clinical trial computer systems, and what principle does this stance aim to uphold in the context of data integrity and regulatory compliance?", "prev_section_summary": "This section discusses the importance of periodic review in maintaining the validation of computerized systems used in clinical trials, as well as the need for a formal change control process. Key topics include elements to be evaluated during periodic reviews such as changes to hardware, operating system, application, security procedures, and more. The change control process involves documenting and authorizing changes, assessing their impact on data integrity and regulatory compliance, updating documentation, and ensuring that changes do not compromise data integrity or safety. The section also emphasizes the importance of retaining documentation related to previous system versions used in clinical trials.", "excerpt_keywords": "EMA guideline, user management, clinical trials, data integrity, segregation of duties"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\n## annex 3 user management\n\na3.1 user management\n\norganisations should have a documented process in place to grant, change and revoke system accesses in a timely manner as people start, change, and end their involvement/responsibility in the management and/or conduct of the clinical trial projects. access to the system should only be granted to trained site users when all the necessary approvals for the clinical trial have been received and all documentation is in place (e.g. signed protocol and signed agreement with the investigator). this also applies to any updates to the system, e.g. 
changes resulting from a protocol amendment should only be made available to users once it is confirmed that the necessary approvals have been obtained, except where necessary to eliminate an immediate hazard to trial participants.\n\na3.2 user reviews\n\nat any given time, an overview of current and previous access, roles and permissions should be available from the system. this information concerning actual users and their privileges to systems should be verified at suitable intervals to ensure that only necessary and approved users have access and that their roles and permissions are appropriate. there should be timely removal of access no longer required, or no longer permitted.\n\na3.3 segregation of duties\n\nsystem access should be granted based on a segregation of duties and also the responsibilities of the investigator and the sponsor, as outlined in ich e6. users with privileged or admin access have extensive rights in the system (operating system or application), including but not limited to changing any system setting (e.g. system time), defining or deactivating users (incl. admin users), activating or deactivating audit trail functionality (and sometimes even editing audit trail information) and making changes to data that are not captured in the audit trail [e.g. backend table changes in the database(s)]. there is a risk that these privileges can be misused. consequently, users with privileged access should be sufficiently independent from and not be involved in the management and conduct of the clinical trial and in the generation, modification, and review of data. users of computer clients [e.g. personal computer (pc)] which record or contain critical clinical trial data, should generally not have admin access to the relevant equipment and when this is not the case, it needs to be justified. 
unblinded information should only be accessible to pre-identified user roles.\n\na3.4 least-privilege rule\n\nsystem access should be assigned according to the least-privilege rule, i.e. users should have the fewest privileges and access rights for them to undertake their required duties for as short a time as necessary.\n\na3.5 individual accounts\n\nall system users should have individual accounts. sharing of accounts (group accounts) is considered unacceptable and a violation of data integrity and ich e6 principles as data should be attributable.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "58f3bff3-d0d5-4f4a-97b5-f22178960d2a": {"__data__": {"id_": "58f3bff3-d0d5-4f4a-97b5-f22178960d2a", "embedding": null, "metadata": {"page_label": "35", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Ensuring Unique Usernames and Ownership in the System: A Comprehensive Guide\"", "questions_this_excerpt_can_answer": "1. What guidelines does the EMA provide regarding the uniqueness and traceability of user access within computerised systems used in clinical trials?\n \n2. How does the EMA guideline suggest distinguishing between accounts intended for interactive human use and machine accounts in the context of electronic data management in clinical trials?\n\n3. 
According to the EMA guideline on computerised systems and electronic data in clinical trials, what is the importance of having a named owner for each user account within the system?", "prev_section_summary": "The section discusses user management in clinical trial computer systems as outlined in the EMA guideline. Key topics include the need for a documented process to grant, change, and revoke system accesses, user reviews to ensure only necessary and approved users have access, segregation of duties to prevent misuse of privileges, the least-privilege rule for assigning system access, and the importance of individual accounts to maintain data integrity and regulatory compliance. The section emphasizes the importance of managing user access effectively to ensure compliance with data integrity and ICH E6 principles in clinical trial projects.", "excerpt_keywords": "EMA, guideline, computerised systems, electronic data, clinical trials"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\n### a3.6 unique usernames\n\nuser access should be unique within the system and across the full life cycle of the system. 
user account names should be traceable to a named owner and accounts intended for interactive use and those assigned to human users should be readily distinguishable from machine accounts.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "38e69b9e-6861-4950-b716-d001d41443e9": {"__data__": {"id_": "38e69b9e-6861-4950-b716-d001d41443e9", "embedding": null, "metadata": {"page_label": "36", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Implementing Robust Security Protocols for the Protection of Clinical Trial Data\"", "questions_this_excerpt_can_answer": "1. What specific security measures does the EMA Guideline recommend for protecting physical computerised systems and data centres containing clinical trial data against unauthorized access and physical threats?\n \n2. How does the EMA Guideline suggest managing firewall settings to ensure the security of clinical trial data against unauthorized external network access, and what periodic action is recommended to maintain their effectiveness?\n\n3. According to the EMA Guideline, what steps should be taken for vulnerability management in computer systems used in clinical trials to prevent unauthorized actions such as data modification or making data inaccessible to legitimate users?", "prev_section_summary": "The section discusses the importance of ensuring unique usernames within computerised systems used in clinical trials, as outlined in the EMA guideline. 
It emphasizes the need for user access to be unique and traceable to a named owner throughout the system's life cycle. Additionally, the guideline suggests distinguishing between accounts intended for interactive human use and machine accounts to ensure proper electronic data management in clinical trials. Having a named owner for each user account is highlighted as crucial for accountability and traceability within the system.", "excerpt_keywords": "EMA, Guideline, computerised systems, electronic data, clinical trials"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\n## annex 4 security\n\na4.1 ongoing security measures\n\nthe responsible party should maintain a security system that prevents unauthorised access to the data. threats and attacks on systems containing clinical trial data and corresponding measures to ensure security of such systems are constantly evolving, especially for systems and services being provided over or interfacing the internet.\n\na4.2 physical security\n\ncomputerised systems, servers, communication infrastructure and media containing clinical trial data should be protected against physical damage, unauthorised physical access, and unavailability. the extent of security measures depends on the criticality of the data. the responsible party should ensure an adequate level of security for data centres as well as for local hardware such as universal serial bus (usb) drives, hard disks, tablets, or laptops.\n\nat a data centre hosting clinical trial data, physical access should be limited to the necessary minimum and should generally be controlled by means of two-factor authentication. the data centre should be constructed to minimise the risk of flooding; there should be pest control and effective measures against fire, i.e. cooling, and fire detection and suppression. 
there should be emergency generators and uninterruptible power supplies (ups) together with redundant internet protocol providers. in case of co-location (see section 6.7 cloud solutions), the servers should be locked up and physically protected (e.g. in cages) to prevent access from other clients. media (e.g. hard disks) should be securely erased or destroyed before disposal.\n\ndata should be replicated at an appropriate frequency from the primary data centre to a secondary failover site at an adequate physical distance to minimise the risk that the same fire or disaster destroys both data centres. a disaster recovery plan should be in place and tested.\n\na4.3 firewalls\n\nin order to provide a barrier between a trusted internal network and an untrusted external network and to control incoming and outgoing network traffic (from certain ip addresses, destinations, protocols, applications, or ports etc.), firewall rules should be defined. these should be defined as strictly as practically feasible, only allowing necessary and permissible traffic.\n\nas firewall settings tend to change over time (e.g. as software vendors and technicians need certain ports to be opened due to installation or maintenance of applications), firewall rules and settings should be periodically reviewed. this should ensure that firewall settings match approved firewall rules and the continued effectiveness of a firewall.\n\na4.4 vulnerability management\n\nvulnerabilities in computer systems can be exploited to perform unauthorised actions, such as modifying data or making data inaccessible to legitimate users. such exploitations could occur in operating systems for servers, computer clients, tablets and mobile phones, routers and platforms (e.g. databases). 
consequently, relevant security patches for platforms and operating systems should be applied in a timely manner, according to vendor recommendations.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "a87953c8-dd45-4a8d-bf19-7b72c498b9d5": {"__data__": {"id_": "a87953c8-dd45-4a8d-bf19-7b72c498b9d5", "embedding": null, "metadata": {"page_label": "37", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Enhancing IT Security Measures in Clinical Trials: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. What specific measures does the EMA Guideline recommend for managing unsupported platforms and operating systems in clinical trials to mitigate security risks?\n2. How does the EMA Guideline propose to ensure the compatibility of software used in clinical trials with updates to platforms and operating systems, and what are the recommended actions to avoid impacts on clinical trial management due to software incompatibilities?\n3. What are the recommended practices outlined in the EMA Guideline for conducting penetration testing on systems used in clinical trials that face the internet, and how should identified vulnerabilities be addressed?", "prev_section_summary": "The section discusses the importance of implementing robust security protocols for the protection of clinical trial data. 
It covers ongoing security measures, physical security measures for computerised systems and data centres, the use of firewalls to control network traffic, and vulnerability management to prevent unauthorized actions. Key entities mentioned include data centres, physical access control, two-factor authentication, firewall rules, vulnerability management, security patches, and disaster recovery planning.", "excerpt_keywords": "EMA Guideline, computerised systems, electronic data, clinical trials, IT security"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\nsystems, which are not security patched in a timely manner according to vendor recommendations, should be effectively isolated from computer networks and the internet, where relevant.\n\na4.5 platform management\n\nplatforms and operating systems for critical applications and components should be updated in a timely manner according to vendor recommendations, in order to prevent their use in an unsupported state. unsupported platforms and operating systems, for which no security patches are available, are exposed to a higher risk of vulnerability. validation of applications on the new platforms and operating systems and of the migration of data should be planned ahead and completed in due time prior to the expiry of the supported state. unsupported platforms and operating systems should be effectively isolated from computer networks and the internet.\n\nit should be ensured that software used in clinical trials remains compatible with any changes to platforms/operating systems in order to avoid unintended impact on the conduct/management of the clinical trial due to interruption of functionality or requirements for alternative software and data migration.\n\na4.6 bi-directional devices\n\nthe use of bi-directional devices (e.g. 
usb devices), which come from or have been used outside the organisation, should be strictly controlled as they may intentionally or unintentionally introduce malware and impact data integrity, data availability, and rights of trial participants.\n\na4.7 anti-virus software\n\nanti-virus software should be installed and activated on systems used in clinical trials. the anti-virus software should be continuously updated with the most recent virus definitions in order to identify, quarantine, and remove known computer viruses. this should be monitored.\n\na4.8 penetration testing\n\nfor systems facing the internet, penetration testing should be conducted at regular intervals in order to evaluate the adequacy of security measures and identify vulnerabilities in system security (e.g. code injection), including the potential for unauthorised parties to gain access to and control of the system and its data. vulnerabilities identified, especially those related to a potential loss of data integrity, should be addressed and mitigated in a timely manner.\n\na4.9 intrusion detection and prevention\n\nan effective intrusion detection and prevention system should be implemented on systems facing the internet in order to monitor the network for successful or unsuccessful intrusion attempts from external parties and for the design and maintenance of adequate information technology (it) security procedures.\n\na4.10 internal activity monitoring\n\nan effective system for detecting unusual or risky user activities (e.g. 
shift in activity pattern) should be in place.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "24aabd37-8341-4cb0-9831-6d083cbc8591": {"__data__": {"id_": "24aabd37-8341-4cb0-9831-6d083cbc8591", "embedding": null, "metadata": {"page_label": "38", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Best Practices for Ensuring Data Security and Authentication in Clinical Trials", "questions_this_excerpt_can_answer": "1. What specific procedures should organizations managing clinical trial data implement to handle security incidents, especially in cases where data may have been compromised, according to the EMA Guideline on computerised systems and electronic data in clinical trials?\n\n2. How does the EMA Guideline recommend managing user authentication to ensure a high degree of certainty in identifying users, and what factors should determine the need for more stringent authentication methods in clinical trial data systems?\n\n3. 
What are the EMA Guideline's recommendations for the use of password managers in the context of accessing clinical trial data, and what risks are associated with password managers built into web browsers according to the document?", "prev_section_summary": "The section discusses the importance of IT security measures in clinical trials, focusing on managing unsupported platforms and operating systems, ensuring software compatibility with updates, controlling bi-directional devices, installing anti-virus software, conducting penetration testing, implementing intrusion detection and prevention systems, and monitoring internal user activities. Key entities include security patches, platforms, operating systems, software validation, bi-directional devices, anti-virus software, penetration testing, intrusion detection, and internal activity monitoring.", "excerpt_keywords": "EMA Guideline, computerised systems, electronic data, clinical trials, data security"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\n## a4.11 security incident management\n\norganisations managing clinical trial data should have and work according to a procedure that defines and documents security incidents, rates the criticality of incidents, and where applicable, implements effective corrective and preventive actions to prevent recurrence. in cases where data have been, or may have been, compromised, the procedures should include ways to report incidents to relevant parties where applicable. 
when using a service provider, the agreement should ensure that incidents are escalated to the sponsor in a timely manner for the sponsor to be able to report serious breaches as applicable, in accordance with regulation (eu) no 536/2014.\n\n## a4.12 authentication method\n\nthe method of authentication in a system should positively identify users with a high degree of certainty. methods should be determined based on the type of information in the system. a minimum acceptable method would be user identification and a password. the need for more stringent authentication methods should be determined based on a risk assessment of the criticality of the data and applicable legislation (including data protection legislation), and generally should include two-factor authentication. user accounts should be automatically locked after a pre-defined number of successive failed authentication attempts, either for a defined period of time, or until they are re-activated by a system administrator after appropriate security checks. biometric approaches are currently not specifically addressed by ich e6. if using biometrics to authenticate the creation of a signature, the investigator and sponsor should ensure that these fulfil the above-mentioned requirements and local legal requirements.\n\n## a4.13 remote authentication\n\nremote access to clinical trial data, e.g. to cloud-based systems, raises specific challenges. the level of security should be proportionate to the sensitivity and confidentiality of the data (e.g. nominative data in electronic medical records are highly sensitive) and to the access rights to be granted (read-only, write or even admin rights). a risk-based approach should be used to define the type of access control required. depending on the level of risk, two-factor authentication may be appropriate or necessary. two-factor authentication implies that two of the following three factors be used:\n\n- something you know, e.g. 
a user identification and password\n- something you have, e.g. a security token, a certificate or a mobile phone and an sms pass code\n- something you are, e.g. a fingerprint or an iris scan (biometrics)\n\n## a4.14 password managers\n\na secure and validated password manager, with a unique, robust user authentication each time it is used to log into a web site or system, can help to create and use different, complex passwords for each site or system. however, attention should be paid to insufficiently secured password managers. password managers built into web browsers may save and automatically fill in user identification and passwords, regardless of whether an independent secure password manager is used or not. this poses a risk if uncontrolled equipment is used (e.g. personal equipment, shared equipment or user accounts), as user access control cannot be enforced; a risk that needs to be effectively mitigated. a policy or contractual arrangement would not be considered adequate to provide a sufficient level of security in such situations.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "9dc52486-2238-4c12-bbba-6097b146b956": {"__data__": {"id_": "9dc52486-2238-4c12-bbba-6097b146b956", "embedding": null, "metadata": {"page_label": "39", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Enhancing Data Security Measures: A Comprehensive Guide for Protecting Sensitive Information", "questions_this_excerpt_can_answer": 
"1. What specific password policy recommendations does the EMA Guideline on computerised systems and electronic data in clinical trials provide to enhance data security in clinical trials?\n \n2. How does the guideline suggest clinical trial systems should handle user inactivity to maintain security, and what restrictions are placed on user capabilities regarding this feature?\n\n3. What strategies does the guideline recommend for protecting clinical trial data against unauthorized back-end changes, particularly in relation to database administrator access and data encryption methods?", "prev_section_summary": "The section discusses key topics related to data security and authentication in clinical trials, as outlined in the EMA Guideline on computerised systems and electronic data. It covers procedures for managing security incidents, the importance of authentication methods, considerations for remote access to data, and the use of password managers. Entities mentioned include user identification, passwords, two-factor authentication, biometrics, and the risks associated with password managers built into web browsers. The section emphasizes the need for organizations to have clear procedures in place, assess risks, and implement appropriate security measures to protect clinical trial data.", "excerpt_keywords": "EMA Guideline, computerised systems, electronic data, clinical trials, data security"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\nthe risk linked to the potential hacking of user equipment or to key loggers should also be considered.\n\n## password policies\n\nformal procedures for password policies should be implemented. the policies should include but not necessarily be limited to length, complexity, expiry, login attempts, and logout reset. 
the policies should be enforced by systems and verified during system validation.\n\n## password confidentiality\n\npasswords should be kept confidential; sharing of passwords is unacceptable and a violation of data integrity. passwords initially received from the system or from a manager or system administrator should be changed by the user on their first connection to the system. this should be mandated by the system.\n\n## inactivity logout\n\nsystems should include an automatic inactivity logout, which logs out a user after a defined period of inactivity. the user should not be able to set the inactivity logout time (outside defined and acceptable limits) or deactivate the functionality. upon inactivity logout, a re-authentication should be required (e.g. password entry).\n\n## remote connection\n\nwhen remotely connecting to systems over the internet, a secure and encrypted protocol (virtual private network (vpn) and/or hypertext transfer protocol secure (https)) should be used.\n\n## protection against unauthorised back-end changes\n\nthe integrity of data should be protected against unauthorised back-end changes made directly on a database by a database administrator. a method to prevent such changes could be by setting the application up to encrypt its data on the database or by storing data un-encrypted with an encrypted copy. 
in either case, the database administrator should not be identical to the administrator of the application.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "69a9221a-3e72-4b28-bd3c-93c44c134794": {"__data__": {"id_": "69a9221a-3e72-4b28-bd3c-93c44c134794", "embedding": null, "metadata": {"page_label": "40", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Guidelines for Implementing Electronic Clinical Outcome Assessment Systems in Clinical Trials", "questions_this_excerpt_can_answer": "1. What specific considerations does the EMA Guideline on computerised systems and electronic data in clinical trials outline for the design and implementation of electronic patient reported outcome (ePRO) systems in clinical trials?\n\n2. How does the guideline address the issue of data viewing by trial participants in electronic patient reported outcome systems, and what are the recommended considerations for determining the period during which data are viewable by participants?\n\n3. 
According to the guideline, what measures should be taken to prevent unreasonable data changes, such as time travel, in electronic patient reported outcome systems, and how should these measures align with the protocol design?", "prev_section_summary": "The section discusses key measures for enhancing data security in clinical trials, focusing on password policies, password confidentiality, inactivity logout, remote connection security, and protection against unauthorized back-end changes. It emphasizes the importance of implementing formal procedures for password policies, maintaining password confidentiality, enforcing inactivity logout features, using secure remote connection protocols, and safeguarding data integrity against unauthorized database administrator changes.", "excerpt_keywords": "EMA, Guideline, computerised systems, electronic data, clinical trials"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\n## annex 5 additional consideration to specific systems\n\nall computerised systems used in clinical trials should fulfil the requirements and general principles described in the previous sections. the following sub-sections define more specific wording for selected types of systems where the gcp inspectors working group (gcp iwg) has found that supplemental guidance is needed. for electronic trial master files (etmfs), please refer to the respective guideline 1.\n\n### a5.1 electronic clinical outcome assessment\n\nelectronic clinical outcome assessment (ecoa) employs technology in addition to other data acquisition tools for the reporting of outcomes by investigators, trial participants, care givers and observers. this guideline does not address the clinical validation or appropriateness of particular ecoa systems. 
the guideline aims at addressing the topics specifically related to these ecoa systems and also to those related to the situation where bring-your-own-device (byod) solutions are used.\n\ndata can be collected by any of several technologies and will be transferred to a server. data should be made available to involved/responsible parties such as the investigator e.g. via portals, display of source data on the server, generation of alerts and reports. these processes should be controlled and clearly described in the protocol (high-level) and protocol-related documents, and all parts of the processes should be validated.\n\ncollecting data electronically may offer more convenience to some trial participants and may increase participant compliance and data quality, reduce variability, reduce the amount of missing data (allowing automatic reminders) and potentially reduce data entry errors. of importance, whilst use of such measures might be of benefit to some trial participants and patient groups, it may be inconvenient for or even result in the exclusion of others. this should be considered when using any data acquisition tool and the choice should be justified.\n\n#### a5.1.1 electronic patient reported outcome\n\n##### a5.1.1.1 system design\n\nelectronic patient reported outcome (epro) should be designed to meet the specific needs of the end users. it is recommended to involve representatives of intended site staff and of the intended trial participant population, where relevant, in the development and testing.\n\none of the advantages of using an epro system is that the timestamps of data entry are recorded. the timestamp should record the time of the data entry and not only the time of the data submission/transmission.\n\ntrial participants should be able to view their own previously entered data, unless justified and unless it is against the purpose of the clinical trial design or the protocol. 
therefore, the period that data are viewable by the participant should be considered when designing/configuring the epro. decisions about the view-period should be based on considerations regarding risk for bias on data to be entered. if viewing of recently entered data is not possible by the participant, then there is a risk that the participant could forget if relevant data have been collected. this is especially the case if the planned entry is event-driven. in addition, this prevents an unnecessary burden to site staff, as they will be contacted by trial participants in case of doubt less often.\n\nlogical checks should be in place to prevent unreasonable data changes such as time travel e.g. going back (months, years in time) or forward into the future based on the protocol design.\n\n1guideline on the content, management and archiving of the clinical trial master file (paper and/or electronic) (ema/ins/gcp/856758/2018).", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "1a2a9bd3-de45-400c-8ee5-b8d8377eb8bd": {"__data__": {"id_": "1a2a9bd3-de45-400c-8ee5-b8d8377eb8bd", "embedding": null, "metadata": {"page_label": "41", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Enhancing Data Collection and Compliance with Electronic Patient-Reported Outcome (ePRO) Systems", "questions_this_excerpt_can_answer": "1. 
How does the EMA guideline suggest handling data corrections within ePRO systems to ensure data integrity and compliance with ICH E6 standards?\n \n2. What specific measures does the EMA guideline recommend for preventing data loss from ePRO devices, especially in scenarios where web access is interrupted or unreliable?\n\n3. According to the EMA guideline, how should ePRO data be managed differently from data collected in electronic case report forms (eCRF) in terms of investigator access and responsibility for trial participant data oversight?", "prev_section_summary": "This section discusses the guidelines for implementing Electronic Clinical Outcome Assessment (eCOA) systems in clinical trials. It covers the design considerations for Electronic Patient Reported Outcome (ePRO) systems, including the involvement of end users in system design, the importance of recording timestamps for data entry, and the ability for trial participants to view their entered data. The section also emphasizes the need for logical checks to prevent unreasonable data changes, such as time travel, in ePRO systems. Additionally, it highlights the benefits of using eCOA systems, such as increased participant compliance, data quality, and reduced data entry errors.", "excerpt_keywords": "EMA guideline, electronic data, clinical trials, ePRO systems, data integrity"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\nit should be considered to include a scheduling/calendar component with alerts or reminders to assist compliance.\n\n## data collection and data transfer\n\nthe same ich e6 standards apply to data collected via epro as to any other method of data collection, i.e. 
that there are processes in place to ensure the quality of the data, and that all clinical information is recorded, handled and stored in such a way as to be accurately reported, interpreted and verified.\n\nan epro system typically requires an entry device. data saved on the device is the original record created by the trial participant. since the data stored in a temporary memory are at higher risk of physical loss, it is necessary to transfer the data to a durable server at an early stage, by a validated procedure and with appropriate security methods during data transmission. data should be transferred to the server according to a pre-defined procedure and at pre-defined times. the data saved on the device are considered source data. after the data are transferred to the server via a validated procedure, the original data can be removed from the device as the data on the server are considered certified copies. the sponsor should identify the source data in the protocol and protocol-related documents and should document the time and locations of source data storage.\n\nin addition to the general requirements on audit trails (please refer to section 6.2.), if an epro system is designed to allow data correction, the data corrections should be documented, and an audit trail should record if the data saved on the device are changed before the data are submitted.\n\ndata loss on devices should be avoided. procedures should be in place to prevent data loss if web access to the trial participant reported data is interrupted (e.g. server outage, device battery drained, loss of or unstable internet connection). there should be a procedure in place to handle failed or interrupted data transmission.\n\nit should be ensured/monitored that the transmission of data from epro devices is successfully completed.\n\nimportant actions should be time-stamped in an unambiguous way, e.g. 
data entries, transfer times and volume (bytes).\n\n## investigator access\n\nunlike data collected in the electronic case report form (ecrf), epro data are not managed (although available for review) by the investigator and are often hosted by a service provider. the investigator is overall responsible for the trial participants' data (including metadata). these should consequently be made available to the investigator in a timely manner. this will allow the investigator to fulfil their responsibilities for oversight of safety and compliance and thereby minimise the risk of missed adverse events or missing data.\n\n## data changes\n\nas stated in section 6.2.1. on audit trails, a procedure should be in place to address and document if a data originator (e.g. investigator or trial participant) realises that they have submitted incorrect data by mistake and want to correct the recorded data.\n\ndata changes for epro typically differ from those of other data acquisition tools because trial participants typically do not have the possibility to correct the data in the application. hence, procedures need to be in place in order to implement changes when needed. 
this depends on the design of tools and processes and could be in the form of data clarification processes initiated by trial participants on their own reported data or initiated by investigators.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "a0220bf7-7e3d-4364-9ed7-f8f7d74000d8": {"__data__": {"id_": "a0220bf7-7e3d-4364-9ed7-f8f7d74000d8", "embedding": null, "metadata": {"page_label": "42", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Guidelines for Data Reporting and Accountability in Clinical Trials", "questions_this_excerpt_can_answer": "1. What measures are recommended by the EMA Guideline to ensure the reliability and accuracy of data reported in clinical trials, especially in relation to changes in trial participant data?\n \n2. How does the EMA Guideline propose to manage the accountability of devices distributed to trial participants in clinical trials, including the tracking and reconciliation of these devices?\n\n3. What specific strategies does the EMA Guideline suggest for maintaining data integrity and participant confidentiality in the context of using electronic patient-reported outcomes (ePROs) and Bring Your Own Device (BYOD) policies in clinical trials?", "prev_section_summary": "The section discusses the use of Electronic Patient-Reported Outcome (ePRO) systems in clinical trials, focusing on data collection, data transfer, investigator access, and data changes. 
Key topics include the importance of ensuring data integrity and compliance with ICH E6 standards, the need for secure data transfer from entry devices to durable servers, procedures to prevent data loss on devices, the role of investigators in overseeing trial participant data, and the handling of data corrections in ePRO systems. Entities mentioned include trial participants, investigators, service providers, source data, certified copies, audit trails, data corrections, and data clarification processes.", "excerpt_keywords": "EMA Guideline, computerised systems, electronic data, clinical trials, data integrity"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\n## data reporting and accountability guidelines\n\ndata reported should always be reliable. procedures for data clarification introduced by the sponsor or service provider should not prevent changes in trial participant data when justified. changes should be initiated in a timely manner based on a solid source of information.\n\ndirect data entry by trial participants helps minimize recall bias. corrections should be made promptly with proper justification. all clinical data, whether collected on paper or electronically, must be accurately reported and verifiable in relation to clinical trials.\n\nthe number of changes to epro data should be limited. proper design of epros and training of trial participants are essential to avoid entry errors.\n\n### accountability of devices\n\nan accountability log of devices given to trial participants should include device identification numbers to reconcile with specific participants.\n\n### contingency processes\n\ncontingency processes must be in place to prevent loss of critical data for participant safety or trial results. 
procedures should be established for device malfunction or loss, including replacing devices and merging data without losing traceability.\n\n### username and password\n\ntrial participants' passwords should be confidential. usernames and passwords should not compromise participant confidentiality. basic user access controls should be implemented for byod, with mobile applications requiring access controls for attributability.\n\n### training\n\ntraining should be customized to meet the specific needs of end users.\n\n### user support\n\nsupport for trial participants and site staff should be readily available to ensure reliable data and minimize data loss. confidentiality must be maintained throughout the communication process. procedures for service desk, user authentication, and access restoration should be in place.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "a4d12606-d0a2-4bc5-9d1b-806fd23650e5": {"__data__": {"id_": "a4d12606-d0a2-4bc5-9d1b-806fd23650e5", "embedding": null, "metadata": {"page_label": "43", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Exploring the Use of BYOD in Clinician Reported Outcome Data Collection: Considerations and Best Practices\"", "questions_this_excerpt_can_answer": "1. 
What specific considerations should be taken into account when implementing Bring Your Own Device (BYOD) strategies for collecting clinician reported outcome data and ePRO data in clinical trials, according to the EMA guidelines?\n \n2. How does the EMA guideline suggest ensuring the integrity and quality of data collected through BYOD in clinical trials, especially considering the variety of devices and operating systems used by trial participants?\n\n3. What are the recommended procedures for maintaining the confidentiality and security of trial participant data in BYOD setups for clinical trials, as outlined in the EMA guideline on computerised systems and electronic data?", "prev_section_summary": "The section discusses guidelines for data reporting and accountability in clinical trials, focusing on ensuring the reliability and accuracy of reported data, managing the accountability of devices distributed to trial participants, maintaining data integrity and participant confidentiality in the context of electronic patient-reported outcomes (ePROs) and Bring Your Own Device (BYOD) policies. Key topics include data clarification procedures, direct data entry by trial participants, limited changes to ePRO data, accountability of devices, contingency processes for device malfunction or loss, confidentiality of usernames and passwords, customized training for end users, and user support for reliable data collection. 
Key entities mentioned include trial participants, devices, data, usernames, passwords, training, and user support.", "excerpt_keywords": "EMA guideline, BYOD, clinician reported outcome, ePRO data, data integrity"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\n## clinician reported outcome\n\ntools to directly collect clinician reported outcomes should generally follow the same requirements as those described for systems in general and for epros. the main difference is the user (investigators, other clinicians, or independent assessors instead of trial participants), not the system requirements. special attention should be given to access control in order to avoid jeopardising any blinding, when relevant.\n\n## bring your own device\n\nboth epro data and clinician reported outcome data may be captured by privately owned devices such as mobile phones, tablets, computers and wearables, i.e. byod. this can either be achieved via a web-application with pre-installed browser applications or by installing an application on the device. solutions can be either a combination of web and application (hybrid) or coded to the device operating system (native).\n\nit is necessary to provide alternative ways of data collection e.g. devices provided by the sponsor, as the trial participants should not be excluded from a trial if not capable of or willing to use byod.\n\n### technical and operational considerations\n\nwhen using byod, a variety of devices, operating systems and where applicable web browsers commonly used, should be considered for the application. it should be ensured that it is not exclusive to one model or operating system.\n\nthe sponsor should describe the minimum technical specifications for participants devices (e.g. operating system, web browser and storage capacity). 
these should take into account which operating systems are still supported by the manufacturer and if bug fixes and security patches have been released, when relevant.\n\nthe sponsor should ensure the quality and integrity of the data across all accepted models and versions. the sponsor has no control over the implementation of updates to the operating system or over the applications on the trial participant's device. these aspects should be taken into consideration in their risk evaluation and subsequent validation activities.\n\nthe application should use an external source for date and time and should not rely on information from the user's device.\n\nprocedures and processes should be in place for when the trial participant discontinues the clinical trial or the clinical trial ends and access to applications and data collection should be terminated.\n\n### considerations on security and trial participant confidentiality\n\nthe confidentiality of data that could identify trial participants should be protected, respecting the privacy and confidentiality rules in accordance with the applicable regulatory requirements.\n\na number of challenges for byod are related to security, and security should be ensured at all levels (mobile device security, data breach security, mobile application security, etc.). as mobile devices may be lost or stolen and it cannot be ensured that the trial participants use any authentication methods to secure their device, access control should be at the application level. 
section a.4.14 on the use of password managers also applies.\n\nrisks linked to known application and operating system vulnerabilities should be minimised.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "232b1f5a-65f0-4d62-8b30-4400fd6e026b": {"__data__": {"id_": "232b1f5a-65f0-4d62-8b30-4400fd6e026b", "embedding": null, "metadata": {"page_label": "44", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Securing and Managing BYOD Devices in Clinical Trials: Ensuring Security and Proper Usage\"", "questions_this_excerpt_can_answer": "1. What specific measures are recommended to ensure the confidentiality and security of trial participant data when using BYOD (Bring Your Own Device) devices in clinical trials, according to the EMA Guideline on computerised systems and electronic data?\n \n2. How does the document address the collection and use of sensitive data, such as location data, through mobile applications in clinical trials, and what are the requirements for obtaining trial participant consent for such data collection?\n\n3. 
What are the guidelines for the installation, support, and uninstallation of applications on BYOD devices in clinical trials to ensure they do not interfere with the device's normal operations or compromise the device upon uninstallation, as outlined in the EMA Guideline?", "prev_section_summary": "The section discusses the use of Bring Your Own Device (BYOD) in collecting clinician reported outcome data and ePRO data in clinical trials, as outlined in the EMA guidelines. It covers considerations for implementing BYOD strategies, technical and operational considerations, security and confidentiality measures, and the importance of ensuring data integrity and quality across various devices and operating systems. Key topics include access control, alternative data collection methods, technical specifications for participant devices, data quality and integrity, confidentiality protection, security measures, and risk minimization related to vulnerabilities in applications and operating systems.", "excerpt_keywords": "BYOD, clinical trials, electronic data, data security, mobile applications"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\nthe hardware, operating system and applications are all factors that affect the total security status of the device, and there should be procedures in place regarding e.g. when trial participants/clinicians use less secure devices.\n\ndata capture by byod may require the device to be identified to ensure data attributability. only information that is needed for proper identification of and service to the user should be obtained. trial participant confidentiality should be ensured if device identification information is stored. access to the application and trial participant data may be protected with multiple barriers (e.g. 
unlock mobile phone, open application, access data).\n\nif the device's built-in capabilities for auto-fill form data and/or using photo, video, and global positioning system (gps) data, etc. are used, this should be described and justified in the protocol. procedures and processes should ensure that only protocol mandated data are collected, and that the confidentiality of data is maintained. in accordance with the principle of data minimisation, mobile applications should only collect data that are necessary for the purposes of the data processing and not access any other information on the person's device. for example, location data should only be collected if it is necessary for the clinical trial activities and the trial participant must be informed about it in the patient information and agree to it in the consent form.\n\nproviders may have end-user licensing agreements or terms of service that allow the sharing of data. this may be in conflict with ich e6 and (local) legal requirements or require information to be provided to the participant and may require specific informed consent. in some cases, the application may not be suitable for use. if an application is to be installed on a byod, the privacy labels/practices (e.g. regarding tracking data, linked and not linked data) should be clearly communicated to the trial participant upfront. the sponsor should be aware that explicit consent may be required related to the above. the informed consent should describe the type of information that will be collected via epro and how that information will be used.\n\n## installation and support\n\nwhen using an application, it is recommended that appropriately trained staff assist in the installation even if the application is available through an app-store or service provider platform. 
independently of whether the byod solution is based on an application installed on the device or a website/web application, the software and the use should be explained thoroughly via targeted training, which may include user manuals, one-to-one training, and multimedia tools. users of the system should have access to user support e.g. from a help desk. there should be a procedure in place in case an application cannot be installed, or the web service is unavailable on a device, if the device has malfunctioned or the participant has purchased a new device. helpdesk contacts by users should be logged (participant or site staff study id, purpose of contact, etc.) with due consideration of protecting participant information.\n\nthe software and software installation should not limit or interfere with the normal operations of the device. any unavoidable limitation to the device after installation should be part of the informed consent material.\n\n## uninstallation\n\nit should be possible to uninstall software or applications without leaving residues on byod devices, e.g. entries in the registry, incorrect mappings or file fragments. the user should be able to uninstall at any time without expertise or assistance. 
the uninstallation process should not compromise the device.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "3703639c-2f7b-4e1a-8580-fe44bd0c3061": {"__data__": {"id_": "3703639c-2f7b-4e1a-8580-fe44bd0c3061", "embedding": null, "metadata": {"page_label": "45", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Testing and Compliance Considerations for Interactive Response Technology System in Clinical Trials: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. What specific considerations should sponsors take into account when writing test scripts for User Acceptance Tests (UAT) related to dosage calculations in Interactive Response Technology (IRT) systems for clinical trials, as outlined in the EMA Guideline?\n\n2. How does the document recommend handling the process of emergency unblinding within Interactive Response Technology systems in clinical trials, including the provision for a backup process?\n\n3. What are the guidelines for integrating clinical data collected via an Interactive Response Technology system with an Electronic Data Collection (EDC) system, particularly concerning data acquisition tools and investigator responsibilities, as specified in the document?", "prev_section_summary": "The section discusses the security measures and guidelines for managing BYOD devices in clinical trials, focusing on ensuring confidentiality and proper usage of trial participant data. 
Key topics include device security factors, data capture and identification, collection of sensitive data through mobile applications, installation and support of applications on BYOD devices, and proper uninstallation procedures. Entities mentioned include trial participants, clinicians, mobile applications, software, user support, informed consent, and device operations.", "excerpt_keywords": "Clinical trials, Interactive Response Technology, User Acceptance Tests, Emergency unblinding, Electronic Data Collection"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\n## a5.2 interactive response technology system\n\na5.2.1 testing of functionalities\n\nin addition to the content of the sections a2.6, a2.10, of this guideline, sponsors should also consider the issues mentioned below when writing test scripts for user acceptance tests (uat).\n\na5.2.1.1 dosage calculations\n\nwhere dosage calculations/assignments are made by the irt system based on user entered data (e.g., trial participant body surface area or weight), and look-up tables (dosage assignment based on trial participant parameters), the tables should be verified against the approved protocol and input data used to test allocations, including test data that would be on a borderline between differing doses. 
assigning the incorrect dosage to a trial participant is a significant risk to safety and well-being, and the risk of such inaccurate assignments should be thoroughly mitigated.\n\na5.2.1.2 stratified randomisation\n\nwhere the randomisation is stratified by factors inputted by the user, all the combinations of the strata should be tested to confirm that the allocation is occurring from the correct randomisation table.\n\na5.2.1.3 blinding and unblinding\n\nunblinded information should only be provided and accessible to pre-identified user roles.\n\na5.2.2 emergency unblinding\n\nthe process for emergency unblinding should be tested. a backup process should also be in place in case the online-technology emergency unblinding is unavailable. it should be verified that a site's ability for emergency unblinding is effectively available before administering imp to a trial participant.\n\na5.2.3 irt used for collection of clinical data from the trial site\n\nwhere the irt system is collecting clinical data, important data should be subject to source data verification and/or reconciliation with the same data collected in the data acquisition tool. for example, the data used for stratification may also be contained in the data acquisition tool. where clinical data is entered into the irt system and integrated in the electronic data collection (edc) system (electronic data transfer to edc), the additional functionality and ich e6 requirements concerning data acquisition tools (ecrfs) should be addressed in the irt system requirements and uat, e.g. investigator control of site-entered data, authorisation of data changes by the investigator, and authorisation of persons entering/editing data in the system by the investigator.\n\na5.2.4 web-based randomisation\n\nwhere justified, the sponsor or investigator/sponsor may also use a web-based application to create randomisation lists for clinical trials. 
when using a web-service, the process to evaluate the suitability of the system and gcp compliance, as well as the fitness for purpose of the created randomisation list, should be documented. the version of the service used and, where applicable, the seed should be maintained. ad hoc randomisation via a web-service is not recommended, as the randomisation distribution is unknown and the sponsor is not in control of the process (e.g. the seed may vary).", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "9ec0d2c0-d4d4-4d02-9cdb-293eeaaf4e53": {"__data__": {"id_": "9ec0d2c0-d4d4-4d02-9cdb-293eeaaf4e53", "embedding": null, "metadata": {"page_label": "46", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Electronic Informed Consent and Provision of Information in Clinical Trials: A Comprehensive Guide", "questions_this_excerpt_can_answer": "1. What specific steps should a sponsor take to ensure that an electronic informed consent procedure is compliant with Good Clinical Practice (GCP) and legally acceptable before its implementation in a clinical trial?\n \n2. How does the document suggest handling situations where national regulations may not fully support the implementation of an entirely electronic informed consent process in clinical trials?\n\n3. 
What considerations are recommended for ensuring that trial participants fully understand the nature and implications of the clinical trial when information is provided electronically, and how should confidentiality be maintained during this process?", "prev_section_summary": "This section discusses testing considerations for Interactive Response Technology (IRT) systems in clinical trials. Key topics include testing functionalities such as dosage calculations, stratified randomisation, blinding and unblinding processes, emergency unblinding procedures, integration of clinical data with Electronic Data Collection (EDC) systems, and web-based randomisation. Entities mentioned include trial participants, user roles, investigators, and sponsors. The importance of verifying data accuracy, ensuring safety, and complying with regulatory requirements is emphasized throughout the section.", "excerpt_keywords": "Electronic Informed Consent, Clinical Trials, GCP Compliance, National Regulations, Confidentiality"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\nthe sponsor should ensure that the process of randomisation can be reconstructed via retained documentation and data and that a final randomisation schedule is retained.\n\nelectronic informed consent\n\nethics committees will review all material related to the informed consent process. before the implementation of an electronic consent procedure is considered, the sponsor should ensure that the electronic consent procedure is gcp compliant and legally acceptable in accordance with the requirements of the independent ethics committees concerned and of the national regulatory authorities. the principles of consent as set out in legislation and guidance should be the same regardless of whether the process involves a computerised system. 
a hybrid approach could be considered, where national requirements preclude certain parts of an electronic informed consent procedure. at present, in some countries, failure to provide written, on-paper proof of a trial participant's informed consent is considered a legal offense.\n\nan electronic informed consent refers to the use of any digital media (e.g. text, graphics, audio, video, podcasts or websites) firstly to convey information related to the clinical trial to the trial participant and secondly to document informed consent via an electronic device (e.g. mobile phones, tablets or computers). the electronic informed consent process involves electronic provision of information, the procedure for providing the opportunity to inquire about details of the clinical trial including the answering of questions and/or electronic signing of informed consent. for example, it would be possible for the trial participant to sign informed consent on a paper form following provision of the information electronically, or the information and informed consent could be entirely electronic. if using a wet ink signature together with an electronic informed consent document (a hybrid approach), the patient information, the informed consent document and the signature should be indisputably linked.\n\nthe method of obtaining an informed consent should ensure the broadest possible access to clinical trials. alternative methods for provision of information and documentation of informed consent should be available for those unable or unwilling to use electronic methods. 
any sole use of electronic informed consent should be justified and described in the protocol.\n\nprovision of information about the clinical trial\n\nthe trial participants should have been informed of the nature, objectives, significance, implications, the expected benefit, risks, and inconveniences of the clinical trial in an interview with the investigator, or another member of the investigating team delegated by the principal investigator. the interview should take into account the individual disposition (e.g. comorbidities, patient preferences, etc.) of the potential participant (or legal representative). this interview should allow interaction, the asking of questions and confirmation of the trial participant's identity, and not simply the provision of information. the interview should be conducted in person, or it could be done remotely where this can be justified, is allowed nationally and is approved by an ethics committee, using electronic methods that allow for two-way communication in real time. whichever method is used, it is important that confidentiality is maintained, and therefore communication methods should be private/secure. consideration should be given as to how the system would be presented to the ethics committee for approval so that it captures the functionality of the system and the experience of the potential trial participant using it. direct system access should be provided to the ethics committee upon request in a timely manner.\n\nprovision of the information electronically may improve the trial participant's understanding of what taking part in the clinical trial will involve. 
computerised systems could facilitate features to assess the", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "60646b7f-bc76-4062-996a-b313b600fb71": {"__data__": {"id_": "60646b7f-bc76-4062-996a-b313b600fb71", "embedding": null, "metadata": {"page_label": "47", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Ensuring Compliance and Confidentiality in Informed Consent and Trial Participant Identity Verification: A Comprehensive Guide\"", "questions_this_excerpt_can_answer": "1. What specific measures are outlined in the EMA Guideline for ensuring the authenticity and integrity of electronic signatures in the informed consent process of clinical trials?\n \n2. How does the EMA Guideline address the verification of trial participant identity, especially in scenarios where consent is given remotely but the participant is later required to visit a clinical trial site?\n\n3. According to the EMA Guideline, what protocols should be in place to protect the confidentiality of trial participants' data and ensure compliance with national and EU regulatory requirements regarding privacy?", "prev_section_summary": "The section discusses the process of electronic informed consent in clinical trials, including the need for compliance with Good Clinical Practice (GCP) and legal requirements. 
It mentions the use of digital media to convey information and document consent, as well as the importance of maintaining confidentiality during this process. The section also covers the provision of information about the clinical trial to participants, emphasizing the need for interaction, two-way communication, and consideration of individual circumstances. Additionally, it suggests alternative methods for providing information and obtaining consent for those who are unable or unwilling to use electronic methods.", "excerpt_keywords": "EMA Guideline, computerised systems, electronic data, clinical trials, informed consent"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\nparticipant's understanding, e.g. via questions at key points, which self-evaluate trial participants' understanding as they work their way through the information. this, in turn, can be used to highlight areas of uncertainty to the person seeking consent so that they can cover this area in more detail with the trial participant.\n\na5.3.2 written informed consent\n\nthe informed consent of the trial participant should be in writing, and electronic methods for documenting the trial participant's informed consent should ensure that the informed consent form is signed and personally dated by at least two (natural) persons: the trial participant or the trial participant's legal representative, and the person who conducted the informed consent discussion. the identity of the persons signing should be ensured.\n\nthe method used to document consent should follow national legislation with regard to e.g. acceptability of electronic signatures (see section 4.8.), and in some countries a wet ink signature will be required. there should be no ambiguity about the time of signature. 
the system should use timestamps, which cannot be manipulated by system settings, in the audit trail for the action of signing and dating by the trial participant and the investigator or qualified person who conducted the informed consent interview. any alterations of the document should invalidate the electronic signature.\n\nif an electronic signature is used, it should be possible for monitors, auditors, and inspectors to access the signed informed consent forms and all information regarding the signatures, including the audit trail. secure archiving should ensure availability and legibility for the required retention period.\n\na5.3.3 trial participant identity\n\nit should always be possible to verify the identity of a trial participant with documentation available to the investigator. documentation that makes it possible to demonstrate that the person entering the electronic signature was indeed the signatory is required. the electronic signing should be captured by the audit trail.\n\nwhere consent is given remotely, and the trial participant is required at some point to visit a clinical trial site for the purposes of the trial, verification should be done in person, e.g. by using information from an official photo identification if such an id document is required in the trial site country.\n\na5.3.4 sponsor notification on the consent process\n\nnotification to the sponsor should only contain essential, non-personally identifiable information to allow the sponsor to have an overview of how many trial participants have been enrolled in a clinical trial so far and which versions of the electronic informed consent form have been used. remote access to personally identifiable information in the electronic system should only be permitted for the corresponding participant, legal representative, investigator, monitor, auditor, or inspector. 
any unjustified accesses, which lead to the disclosure of non-pseudonymised information, are likely to be viewed as an infringement of data privacy laws.\n\na5.3.5 trial participant confidentiality\n\nas for all other computerised systems in clinical trials, the confidentiality of data that could identify trial participants should be protected, respecting the privacy and confidentiality rules in accordance with applicable national and eu regulatory requirements.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "8727a5c1-1679-4734-ac27-0aa0eb127c94": {"__data__": {"id_": "8727a5c1-1679-4734-ac27-0aa0eb127c94", "embedding": null, "metadata": {"page_label": "48", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Clinical Trial Participant Access and Consent Process: A Guide for Candidates", "questions_this_excerpt_can_answer": "1. What specific steps should an investigator take to ensure the confidentiality and understanding of informed consent documentation by trial participants, according to the EMA Guideline on computerised systems and electronic data in clinical trials?\n\n2. How does the EMA Guideline on computerised systems and electronic data in clinical trials address the issue of version control and approval for electronic informed consent documents during a clinical trial?\n\n3. 
What are the requirements and procedures outlined in the EMA Guideline for handling a trial participant's withdrawal from a clinical trial through a computerised system, including the generation of alerts to the investigator?", "prev_section_summary": "This section discusses the importance of ensuring compliance and confidentiality in the informed consent process and trial participant identity verification in clinical trials. Key topics include measures for ensuring the authenticity and integrity of electronic signatures, written informed consent requirements, verification of trial participant identity, sponsor notification on the consent process, and trial participant confidentiality. Entities mentioned include trial participants, legal representatives, investigators, monitors, auditors, inspectors, and sponsors. The section emphasizes the need for secure documentation, verification of identity, and protection of data confidentiality in accordance with national and EU regulatory requirements.", "excerpt_keywords": "Clinical trials, Informed consent, Electronic data, Trial participant access, Withdrawal process"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\n## a5.3.6 trial participant access\n\npotential trial participants (or, where applicable, their legal representative) should be provided with access to written information about the clinical trial prior to seeking their informed consent. the trial participant should be provided with their own copy of the informed consent documentation (including all accompanying information and all linked information) once their consent has been obtained. this includes any changes to the data (documents) made during the process.\n\nthe information about the clinical trial should be a physical hard copy or electronic copy in a format that can be downloaded. 
the copy should be available immediately to the trial participant.\n\n## a5.3.7 investigator responsibilities\n\nthe investigator should take appropriate measures to verify the identity of the potential trial participant (see section a5.3.3) and ensure that the participant has understood the information given. the informed consent documents are essential documents that should be available at the trial site in the investigator tmf for the required retention period (see section a5.3.9). the investigator should retain control of the informed consent process and documentation (e.g. signed informed consent forms) and ensure that personally identifiable data are not inappropriately disclosed beyond the site. the system used should not limit the investigator's ability to ensure that trial participants' confidentiality is protected with appropriate access and retention controls in the system. the investigator should ensure an appropriate process for the copy of the informed consent documentation (information sheet and signed consent form) to be provided to the trial participant. all versions of signed and dated electronic consents should be available to the trial participant for the duration of and after the trial. the system used should ensure that the investigator can grant and revoke access to the electronic informed consent system for monitors, auditors and regulatory authority inspectors.\n\n## a5.3.8 version control and availability to sites\n\nthe electronic informed consent information (electronic trial participant information and informed consent form) may be subject to updates and changes during the course of the trial. regardless of the nature of the change or update, the new version containing relevant information has to receive the favourable opinion/approval of the ethics committee(s) prior to its use. 
additional information should be made available to the ethics committee(s) concerning technical aspects of the electronic informed consent procedure to ensure continued understanding of the informed consent processes. only versions approved by the ethics committee(s) should be enabled and used for the informed consent process and documentation. release of electronic trial participant information and informed consent forms to the sites prior to irb/iec approval should be prevented. the system should prevent the use of obsolete versions of the information and informed consent document.\n\n## a5.3.9 availability in the investigators part of the trial master file\n\nall documents of the informed consent procedure (including all accompanying information and all linked information) are considered to be essential documents and should be archived as such. replacement of the documents with copies is only acceptable if the copies are certified copies (see section 6.5.).\n\n## a5.3.10 withdrawal from the trial\n\nthere should be procedures and processes in place for a trial participant to be able to withdraw their consent. 
if there is a possibility for the trial participant to withdraw from the trial through the computerised system, it should be ensured that such a withdrawal of consent generates an alert to the investigator in order to initiate the relevant steps as per protocol and according to the extent of", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "71a9a067-21f8-4c98-b3af-1faed64fcaae": {"__data__": {"id_": "71a9a067-21f8-4c98-b3af-1faed64fcaae", "embedding": null, "metadata": {"page_label": "49", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Data Protection and Consent Withdrawal Policy", "questions_this_excerpt_can_answer": "1. What is the policy on the impact of informed consent withdrawal on previously collected data in clinical trials as outlined in the EMA Guideline on computerised systems and electronic data?\n \n2. According to the document titled \"Data Protection and Consent Withdrawal Policy\" from the EMA Guideline, how does the withdrawal of informed consent affect the storage and use of data obtained prior to the withdrawal in clinical trials?\n\n3. 
In the context of the EMA Guideline on computerised systems and electronic data in clinical trials, what measures are outlined for handling data obtained from participants who later withdraw their informed consent?", "prev_section_summary": "The section discusses the access and responsibilities of trial participants and investigators in clinical trials, focusing on informed consent documentation. Key topics include providing trial participants with access to information, ensuring understanding of informed consent, maintaining version control and approval for electronic documents, and handling participant withdrawals from the trial. Entities mentioned include trial participants, investigators, ethics committees, electronic informed consent information, trial master file, and withdrawal procedures.", "excerpt_keywords": "Data Protection, Consent Withdrawal, Informed Consent, Clinical Trials, Electronic Data"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\nwithdrawal. 
any withdrawal of informed consent should not affect the results of activities already carried out, such as the storage and use of data obtained on the basis of informed consent before withdrawal.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "13cb69d2-b0f9-4694-a116-1b5949a40b38": {"__data__": {"id_": "13cb69d2-b0f9-4694-a116-1b5949a40b38", "embedding": null, "metadata": {"page_label": "50", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Guidelines for Implementing Clinical Systems in Clinical Trials", "questions_this_excerpt_can_answer": "1. What specific considerations should be taken into account when an institution plans to use existing electronic medical records or other computerized systems for clinical trial purposes, according to the EMA guidelines?\n \n2. How does the EMA guideline recommend sponsors assess the computerized systems used by an investigator or institution for their suitability in clinical trials, particularly concerning the system's ability to ensure the rights, safety, dignity, and well-being of trial participants?\n\n3. 
What does the EMA guideline suggest regarding the documentation of medical oversight by the investigator when electronic medical records are utilized in clinical trials, especially in scenarios where data entry is performed by research nurses or dedicated data entry staff?", "prev_section_summary": "The section discusses the policy outlined in the EMA Guideline on computerised systems and electronic data in clinical trials regarding the impact of informed consent withdrawal on previously collected data. It emphasizes that any withdrawal of informed consent should not affect the storage and use of data obtained prior to the withdrawal in clinical trials. The key topics include data protection, consent withdrawal policy, handling of data obtained from participants who withdraw consent, and the importance of maintaining the integrity of data collected before consent withdrawal.", "excerpt_keywords": "EMA guideline, computerised systems, electronic data, clinical trials, data integrity"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\n## annex 6 clinical systems\n\nas stated in sections 2. and 4.6., computerised systems implemented at the trial site are also within the scope of this guideline, and the general approach towards computerised systems used in clinical practice is that the decision to use a system in a clinical trial should be risk proportionate and justified pre-trial. this section is dedicated to specific and additional considerations regarding electronic medical records and other systems implemented at sites, which are primarily used in clinical practice but are also generating clinical trial data. 
for computerised systems built specifically for data collection in clinical trials please refer to the relevant sections of this guideline.\n\n### a6.1 purchasing, developing, or updating computerised systems by sites\n\nthe investigator/institution should have adequate facilities for a clinical trial. this also applies to the computerised systems of the institution if considered to be used for clinical trial purposes. it is recommended that institutions planning to perform clinical trials consider whether system functionality is fit for the clinical trial purpose. this should also be considered prior to the introduction of a new electronic medical record or equipment planned to be used in clinical trials (e.g. scanners, x-ray, electrocardiograms), or prior to changes to existing systems. to ensure that system requirements related to gcp compliance (e.g. audit trail for an electronic medical record) are addressed, experienced clinical trial practitioners should be involved by the institution in the relevant steps of the procurement and validation processes. as many systems are designed with different configuration options, it should be ensured that the systems are configured in a gcp compliant manner.\n\n### a6.2 site qualification by the sponsor\n\nas part of the site qualification, the sponsor should assess the systems in use by the investigator/institution to determine whether the systems are fit for their intended use in the clinical trial (e.g. include an audit trail). the assessment should cover all computerised systems used in the clinical trial and should include consideration of the rights, safety, dignity and wellbeing of trial participants and the quality and integrity of the trial data. if the systems do not fulfil the requirements, the sponsor should consider whether to select the investigator/institution. 
the use of systems not fulfilling requirements should be justified, either based on planned implementation of effective mitigating actions or a documented impact assessment of residual risks.\n\n### a6.3 training\n\nif the use of the systems in the context of a specific trial is different from the use in clinical practice e.g. different scanning procedures, different location of files, different requirements regarding documentation etc., trial specific training is required.\n\n### a6.4 documentation of medical oversight\n\nthe investigator should be able to demonstrate their medical oversight of the clinical trial when electronic medical records are used. where all or part of the entries into the medical records are made by a research nurse/dedicated data entry staff it can be difficult to reconstruct the investigators input. the system", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "880c70bd-2a15-465f-878b-a8f1b53fdafc": {"__data__": {"id_": "880c70bd-2a15-465f-878b-a8f1b53fdafc", "embedding": null, "metadata": {"page_label": "51", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "Ensuring Data Security and Access Management in Clinical Trials: Best Practices and Guidelines", "questions_this_excerpt_can_answer": "1. 
What specific procedures must be in place at clinical trial sites to ensure the confidentiality of trial participants' identities when electronic medical records are shared with sponsors or service providers?\n \n2. How does the document address the management of user access and the importance of secure and attributable access in clinical trials, especially in scenarios where trial information could potentially unblind the treatment?\n\n3. What are the guidelines for granting sponsor representatives, such as monitors and auditors, direct access to trial participants' data, and how does it ensure that this access is both comprehensive and restricted to relevant data only?", "prev_section_summary": "This section of the document focuses on the guidelines for implementing clinical systems in clinical trials, specifically addressing the use of electronic medical records and other computerized systems at trial sites. Key topics include considerations for purchasing, developing, or updating computerized systems, site qualification by the sponsor, training requirements for using systems in clinical trials, and documentation of medical oversight when electronic medical records are utilized. Entities mentioned include investigators, institutions, sponsors, trial participants, and clinical trial practitioners. 
The section emphasizes the importance of ensuring system functionality is fit for the clinical trial purpose and that systems are configured in a GCP compliant manner.", "excerpt_keywords": "Clinical trials, Data security, Access management, Electronic medical records, Sponsor representatives"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\nshould allow the investigator to document the assessment and acknowledgement of information entered into the system by others.\n\n## a6.5 confidentiality\n\npseudonymised copies of electronic medical records may be provided to sponsors, or service providers working on their behalf, outside the clinical environment e.g. if needed for endpoint adjudication or safety assessments according to the protocol. national regulations need to be followed by the sites. in such cases there should be:\n\n- procedures in place at the site to redact copies of medical records, in order to protect the trial participants identity, before transfer;\n- security measures in place, which are relevant to the process, including pseudonymisation and redaction;\n- a copy of the pseudonymised records and a proof of the transfer made at the site;\n- organisational and technical procedures in place on the receiving side to ensure that the requirements of the data protection regulation are met.\n\ndue to the sensitive nature of information documented in medical records, the extent to which sponsors request these data should be ethically and scientifically justified and limited to specific critical information. any planned collection of redacted copies of medical records by the sponsor should be described in the protocol, or related documents, and should be explicit in the patient information.\n\n## a6.6 security\n\nsecurity measures that prevent unauthorised access to data and documents should be maintained. 
please refer to section 5.4. regarding more details on the general requirements for security systems, which are equally applicable to research institutions.\n\n## a6.7 user management\n\nrobust procedures on user management should be implemented (see annex 3). for systems deployed by the investigator/institution, the investigator should ensure that individuals have secure and attributable access appropriate to the tasks they are delegated to in the trial. robust processes for access rights are particularly important in trials where parts of the information could unblind the treatment. such information should only be accessible to unblinded staff.\n\n## a6.8 direct access\n\nsponsor representatives (monitors and auditors) and inspectors should have direct, read-only access to all relevant data for all trial participants as determined by the monitors, auditors or inspectors while taking the collected data and the clinical trial protocol into account. this may require access to several different sections or modules of the respective (medical) record e.g. imaging. this requires the use of a unique identification method e.g. username and password. 
the access of monitors, auditors and inspectors should be restricted to the trial participants (including potential participants screened but not enrolled in the trial) and should include access to audit trails.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "564fe347-2515-4fcb-a646-0d124eb5c6bf": {"__data__": {"id_": "564fe347-2515-4fcb-a646-0d124eb5c6bf", "embedding": null, "metadata": {"page_label": "52", "file_name": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf", "file_type": "application/pdf", "file_size": 1032611, "creation_date": "2024-04-07", "last_modified_date": "2024-04-04", "document_title": "\"Securing Remote Access and Data Archiving for Clinical Trials: Ensuring Compliance and Security Measures\"", "questions_this_excerpt_can_answer": "1. What specific measures should be taken to ensure the security and integrity of patient data when providing remote access in clinical trials, according to the EMA guideline?\n \n2. How does the EMA guideline address the relationship between trial-specific data acquisition tools and the electronic medical record, particularly in terms of legal requirements and the management of patient information?\n\n3. What are the EMA's recommendations for the archiving of electronic data and metadata in clinical trials to comply with regulatory retention requirements, and how does it suggest handling potential issues such as institution relocation or closure?", "prev_section_summary": "The section discusses the importance of ensuring data security and access management in clinical trials. 
Key topics include procedures for maintaining confidentiality when sharing electronic medical records with sponsors, the need for security measures such as pseudonymisation and redaction, user management procedures to ensure secure and attributable access, and guidelines for granting sponsor representatives direct access to relevant trial data. The section emphasizes the ethical and scientific justification for requesting sensitive medical records and the importance of restricting access to unblinded staff only.", "excerpt_keywords": "EMA, guideline, computerised systems, electronic data, clinical trials"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[2] EMA Guideline on computerised systems and electronic data in clinical trials.pdf\nif the site has agreed to provide remote access, appropriate security measures and procedures should be in place to support such access without jeopardising patient rights, data integrity, or compliance with national legislation.\n\n## a6.9 trial specific data acquisition tools\n\nthe electronic medical record contains information which is crucial for the management of patients and is designed to fulfil legal requirements.\n\nany trial specific data acquisition tools implemented cannot replace the medical record, and their use should not result in a depletion of relevant information in the medical record.\n\nmonitoring activities should not be limited to information in the data acquisition tools and should also consider relevant information in the medical record.\n\nplease also refer to the published qualification opinion on esource direct data capture (ddc) ema/chmp/sawp/483349/2019.\n\n## a6.10 archiving\n\nappropriate archiving should be in place to ensure long-term readability, reliability and retrievability of electronic data (and metadata), in line with regulatory retention requirements. please also refer to section 6.11. 
requirements for the retention of clinical trial data and documents are frequently different from requirements for other data and documents held by the investigators. it should be ensured that there is no premature destruction of clinical trial data in case of e.g. institution relocation or closure. it is the responsibility of the sponsor to inform the hospital, institution or practice as to when these documents will no longer need to be retained.\n\nthere are specific requirements for backup, etc. of electronic data, which can be seen in section 6.8 and which are equally applicable to research institutions. please also refer to the guideline on the content, management and archiving of the clinical trial master file (paper and/or electronic) ema/ins/gcp/856758/2018.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "08f45876-e4b2-45bc-af5a-e2d6d7798ab6": {"__data__": {"id_": "08f45876-e4b2-45bc-af5a-e2d6d7798ab6", "embedding": null, "metadata": {"page_label": "1", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Consultative Process Overview for the Adoption of the Data Quality Framework in EU Medicines Regulation by CHMP\"", "questions_this_excerpt_can_answer": "1. What is the specific date when the Data Quality Framework for EU Medicines Regulation was adopted by the Committee for Medicinal Products for Human Use (CHMP)?\n \n2. 
Can you detail the timeline and key milestones in the consultative process for the adoption of the Data Quality Framework in EU Medicines Regulation, including the start of the consultation period and the final adoption date?\n\n3. What are the official contact details and address for the European Medicines Agency (EMA) as provided in the document discussing the Data Quality Framework for EU Medicines Regulation?", "excerpt_keywords": "data quality framework, medicines regulation, data quality dimensions, primary use of data, secondary use of data"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\n# data quality framework for eu medicines regulation\n\ndate: 30 october 2023\n\ndata analytics and methods task force\n\nema/326985/2023\n\ndraft agreed by bdsg for release for consultation: 10 october 2022\n\nend of consultation (deadline for comments): 18 november 2022\n\nagreed by bdsg and mwp: 30 june 2023\n\nadopted by chmp: 30 october 2023\n\nkeywords: data quality framework, medicines regulation, data quality dimensions, primary and secondary use of data\n\nofficial address: domenico scarlattilaan 6 * 1083 hs amsterdam * the netherlands\n\naddress for visits and deliveries: refer to www.ema.europa.eu/how-to-find-us\n\nsend us a question: go to www.ema.europa.eu/contact telephone +31 (0)88 781 6000an agency of the european union\n\n(c) european medicines agency, 2023. 
reproduction is authorised provided the source is acknowledged.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "44cb4582-503f-4660-aaae-44e7165d483d": {"__data__": {"id_": "44cb4582-503f-4660-aaae-44e7165d483d", "embedding": null, "metadata": {"page_label": "2", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Enhancing EU Medicines Regulation: A Comprehensive Guide to Data Quality Framework - Definitions, Dimensions, and Recommendations\"", "questions_this_excerpt_can_answer": "1. What are the specific dimensions and metrics proposed in the \"Enhancing EU Medicines Regulation: A Comprehensive Guide to Data Quality Framework\" for assessing data quality in the context of EU medicines regulation?\n\n2. How does the document define and differentiate between \"data\" and \"information\" within the framework of EU medicines regulation, and what implications does this distinction have for data quality management?\n\n3. What are the recommended general considerations and maturity models for maintaining and assessing data quality in the realm of EU medicines regulation, as outlined in the document?", "prev_section_summary": "The section provides information about the Data Quality Framework for EU Medicines Regulation, highlighting its adoption process by the Committee for Medicinal Products for Human Use (CHMP). 
The key dates mentioned include the start of the consultation period on 10 October 2022, the end of the consultation period on 18 November 2022, the agreement by BDSG and MWP on 30 June 2023, and the final adoption by CHMP on 30 October 2023. The document is associated with the Data Analytics and Methods Task Force and carries the reference number EMA/326985/2023. It discusses the importance of data quality dimensions and the primary and secondary use of data within the context of medicines regulation in the EU. Additionally, the official address and contact details for the European Medicines Agency (EMA) are provided, including their location in Amsterdam, the Netherlands, and how to contact them via their website or telephone.", "excerpt_keywords": "data quality, EU medicines regulation, reliability, coherence, timeliness"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\n# table of contents\n\n|1. executive summary|4|\n|---|---|\n|2. abbreviations|5|\n|3. background - the need for a data quality framework for medicines regulation|5|\n|4. scope of this dqf|6|\n|4.1. definition of data|7|\n|4.2. definition of dq|7|\n|4.3. limitations of scope|7|\n|4.4. structure of this dqf|8|\n|5. general considerations underlying the maintenance and assessment of dq|8|\n|5.1. dq determinants for evidence generation|8|\n|5.2. dq along the evidence generation process|10|\n|5.2.1. dq vs standardisation|11|\n|5.2.2. primary vs secondary use of data|11|\n|5.2.3. publication vs data consumption|12|\n|5.3. data and metadata|12|\n|5.4. data immutability|13|\n|5.5. data vs information|13|\n|5.6. frame of reference (validation vs verification)|13|\n|5.7. granularity of data and dq|13|\n|6. dq dimensions and metrics|14|\n|6.1. reliability|15|\n|6.1.1. when considering the \"fit for purpose\" definition of quality, reliability covers how correct and true the data are. 
reliability sub-dimensions|15|\n|6.1.2. considerations for reliability|16|\n|6.1.3. examples of reliability metrics|17|\n|6.2. extensiveness|19|\n|6.2.1. sub-dimensions of extensiveness|19|\n|6.2.2. considerations for extensiveness|20|\n|6.2.3. examples of metrics for extensiveness|20|\n|6.3. coherence|20|\n|6.3.1. sub-dimensions of coherence|21|\n|6.3.2. considerations for coherence|21|\n|6.3.3. examples of metrics for coherence|23|\n|6.4. timeliness|25|\n|6.4.1. sub-dimensions of timeliness|25|\n|6.4.2. considerations for timeliness|25|\n|6.4.3. examples of metrics for timeliness|25|\n|6.5. relevance|25|\n|6.5.1. examples of metrics for relevance|26|\n|7. general recommendations and maturity models|26|\n|7.1. foundational determinants: recommendation and maturity levels|32|\n\ndata quality framework for eu medicines regulation ema/326985/2023 page 2/42", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "08a7e067-a6b7-45a0-8132-4689a6bb3b33": {"__data__": {"id_": "08a7e067-a6b7-45a0-8132-4689a6bb3b33", "embedding": null, "metadata": {"page_label": "3", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Implementing a Data Quality Framework in EU Medicines Regulation: Determinants, Levels, and Key Considerations\"", "questions_this_excerpt_can_answer": "1. What are the four levels of maturity in implementing a Data Quality Framework (DQF) in EU medicines regulation, and how are they defined in terms of documentation, formalization, implementation, and automation?\n\n2. 
How does the document describe the progression of intrinsic determinants of data quality from being intrinsic to incorporating feedback mechanisms, and what are the specific maturity levels associated with each stage?\n\n3. What specific roles do quality at source, master data management (MDM), Quality Management Systems (QMS), computerized systems, ISO, and industry standards play in the implementation of a Data Quality Framework within the context of EU medicines regulation, according to the document?", "prev_section_summary": "The section provides an overview of the contents of a document titled \"Enhancing EU Medicines Regulation: A Comprehensive Guide to Data Quality Framework - Definitions, Dimensions, and Recommendations.\" The document is structured to address various aspects of data quality (DQ) within the context of EU medicines regulation, as outlined by the European Medicines Agency (EMA) with the document reference ema/326985/2023.\n\nKey topics covered in the document include:\n\n1. **Executive Summary**: A brief overview of the document's purpose and key findings.\n2. **Abbreviations**: A list of abbreviations used throughout the document.\n3. **Background**: Discusses the necessity of a data quality framework for medicines regulation.\n4. **Scope of the Data Quality Framework (DQF)**: This section is further divided into sub-sections that define data and data quality (DQ), outline the limitations of the scope, and describe the structure of the DQF.\n5. **General Considerations for DQ**: Explores foundational concepts for maintaining and assessing data quality, including determinants for evidence generation, the process of evidence generation, data vs. information, data immutability, and the granularity of data and DQ.\n6. 
**DQ Dimensions and Metrics**: Detailed discussion on the dimensions of data quality such as reliability, extensiveness, coherence, timeliness, and relevance, including their sub-dimensions, considerations, and examples of metrics.\n7. **General Recommendations and Maturity Models**: Offers recommendations for improving data quality and outlines maturity models for assessing and enhancing DQ within the realm of EU medicines regulation.\n\nEntities mentioned include:\n- **Data Quality (DQ)**: Refers to the overall utility of data as determined by various dimensions.\n- **Evidence Generation Process**: The process through which data is collected, analyzed, and used to make regulatory decisions.\n- **Data vs. Information**: Differentiation between raw data and processed data (information) and its implications for DQ management.\n- **Data Immutability**: The concept that data should not be altered or tampered with.\n- **Granularity**: The level of detail or specificity of the data and its impact on data quality.\n- **Reliability, Extensiveness, Coherence, Timeliness, Relevance**: Dimensions of data quality, each with its own set of considerations and metrics for assessment.\n\nThe document aims to provide a comprehensive framework for assessing and enhancing data quality in the context of EU medicines regulation, offering specific dimensions and metrics for evaluation, as well as general considerations and recommendations for maintaining high data quality standards.", "excerpt_keywords": "Data Quality Framework, EU Medicines Regulation, Maturity Levels, Master Data Management, Quality Management Systems"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\n# 7.1.1. level 1: documented … 32\n\n# 7.1.2. level 2: formalised … 33\n\n# 7.1.3. level 3: implemented … 33\n\n# 7.1.4. level 4: automated … 33\n\n# 7.2. intrinsic determinants: recommendations and maturity levels … 33\n\n# 7.2.1. level 0: intrinsic … 33\n\n# 7.2.2. level 1: metadata … 33\n\n# 7.2.3. level 2: standardised … 34\n\n# 7.2.4. level 3: automated … 34\n\n# 7.2.5. level 4: feedback … 34\n\n# 7.3. question-specific determinants: recommendations and maturity levels … 34\n\n# 7.3.1. level 1: ad-hoc … 34\n\n# 7.3.2. level 2: domain-defined … 34\n\n# 7.3.3. level 3: question-defined … 34\n\n# 8. considerations for implementation of dqf … 35\n\n# 8.1. quality at source … 35\n\n# 8.2. the role of master data management (mdm) and reference data … 35\n\n# 8.3. the role of qms and computerised systems … 35\n\n# 8.4. the role of iso and industry standards … 36\n\n# 8.5. notes on alcoa + … 37\n\n# 8.6. notes on implementation of dq controls … 37\n\n# 9. glossary … 39\n\n# 10. references … 42\n\ndata quality framework for eu medicines regulation ema/326985/2023 page 3/42", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "595e6de2-1c6b-4393-8128-b6448b1b56d0": {"__data__": {"id_": "595e6de2-1c6b-4393-8128-b6448b1b56d0", "embedding": null, "metadata": {"page_label": "4", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Enhancing Regulatory Decision-Making: The EU Data Quality Framework for Medicines Regulation with Principles, Dimensions, and a Maturity Model\"", "questions_this_excerpt_can_answer": "1. 
What are the specific data quality dimensions and sub-dimensions outlined in the EU Data Quality Framework for medicines regulation, and how are they characterized and measured?\n \n2. How does the EU Data Quality Framework propose to evolve the automation processes to support data-driven regulatory decision-making in the context of medicines regulation?\n\n3. What is the role of the maturity model introduced in the EU Data Quality Framework, and how does it guide regulatory entities in improving their data quality practices for better decision-making in medicines regulation?", "prev_section_summary": "This section outlines the structure and content of a document focused on implementing a Data Quality Framework (DQF) within the context of EU medicines regulation. The document is structured into several key areas, each addressing different aspects of data quality and its management:\n\n1. **Levels of Maturity in DQF Implementation**: The document categorizes the maturity of implementing a Data Quality Framework into four levels:\n - Level 1: Documented\n - Level 2: Formalized\n - Level 3: Implemented\n - Level 4: Automated\n\n2. **Intrinsic Determinants of Data Quality**: It discusses the progression of data quality from intrinsic characteristics through to feedback mechanisms, across five maturity levels:\n - Level 0: Intrinsic\n - Level 1: Metadata\n - Level 2: Standardized\n - Level 3: Automated\n - Level 4: Feedback\n\n3. **Question-Specific Determinants**: This part addresses how data quality considerations can vary depending on specific questions or contexts, with three maturity levels outlined:\n - Level 1: Ad-hoc\n - Level 2: Domain-Defined\n - Level 3: Question-Defined\n\n4. 
**Considerations for DQF Implementation**: Several key considerations and components for implementing a Data Quality Framework are discussed, including:\n - Quality at Source\n - The role of Master Data Management (MDM) and Reference Data\n - The role of Quality Management Systems (QMS) and Computerized Systems\n - The role of ISO and Industry Standards\n - Notes on ALCOA+ (a framework for ensuring data integrity)\n - Notes on the implementation of Data Quality Controls\n\n5. **Additional Sections**: The document also includes a glossary and references section, providing definitions of terms used and citing sources.\n\nOverall, the document serves as a comprehensive guide for implementing a Data Quality Framework in the context of EU medicines regulation, covering the theoretical underpinnings, practical considerations, and specific roles of various data quality components and standards.", "excerpt_keywords": "Data Quality Framework, EU Medicines Regulation, Regulatory Decision-Making, Maturity Model, Automation Processes"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\n# executive summary\n\nthis document is the first release of the eu data quality framework (dqf) for medicines regulation and defines high-level principles and procedures that apply across emas regulatory mandate. this framework provides general considerations on data quality that are relevant for regulatory decision making, definitions for data quality dimensions and sub-dimensions, as well as their characterization and related metrics. 
it provides an analysis of what data quality actions and metrics should be considered in different use cases and introduces a maturity model to guide the evolution of automation to support data-driven regulatory decision making.\n\nthis document is intended to be a general resource from which more focused recommendations can be derived for specific regulatory domains with specified metrics and checks. see figure 1 for a summarized representation of the key points of the dqf.\n\n[figure 1 (graphic): data quality determinants - foundational, intrinsic, and question-specific determinants, characterized for the development of the maturity models; data quality dimensions - reliability (are data faithfully representing what they are meant to?), extensiveness (are data sufficient?), coherence (are data analyzable?), relevance (are data of the right kind?), timeliness (are data available at the right time?); data life cycle - data requirement, data collection, data managing and processing, data publishing, delivery, testing/acceptance, data procurement]\n\nfigure 1 - representation of the key points of the data quality framework\n\ndata quality framework for eu medicines regulation ema/326985/2023 page 4/42", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "0a85a178-f894-4802-bdb2-d68169832fb9": {"__data__": {"id_": "0a85a178-f894-4802-bdb2-d68169832fb9", "embedding": null, "metadata": {"page_label": "5", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU 
medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Enhancing EU Medicines Regulation Through a Data Quality Framework: Leveraging Big Data and Digitalisation for Improved Regulatory Decision-Making\"", "questions_this_excerpt_can_answer": "1. What specific recommendations did the HMA-EMA Joint Big Data Task Force make regarding the establishment of an EU framework for data quality and representativeness in medicines regulation?\n\n2. How does the EU medicines regulatory framework intend to adapt to the shift from document-based submissions to direct assessments of underlying data, and what role does standardisation play in this transition?\n\n3. What are the identified challenges and opportunities presented by digitalisation and information technology in the context of regulatory decision-making for EU medicines, according to the document?", "prev_section_summary": "The section provides an overview of the first release of the EU Data Quality Framework (DQF) for medicines regulation, which is designed to establish high-level principles and procedures applicable across the European Medicines Agency's (EMA) regulatory mandate. The framework emphasizes the importance of data quality in regulatory decision-making and outlines specific dimensions and sub-dimensions of data quality, including their characterization and metrics. 
It also introduces a maturity model to guide the evolution of automation processes in support of data-driven decision-making within the context of medicines regulation.\n\nKey topics covered in the section include:\n- The purpose and scope of the EU Data Quality Framework for medicines regulation.\n- General considerations on data quality relevant to regulatory decision-making.\n- Definitions, characterizations, and metrics for data quality dimensions and sub-dimensions.\n- The introduction of a maturity model for guiding automation and improving data-driven regulatory decision-making.\n\nEntities and concepts highlighted in the section include:\n- Data quality determinants (e.g., extensiveness, meaning, relational aspects).\n- Data quality dimensions (e.g., reliability, coherence, relevance, timeliness).\n- The data life cycle (including data requirement, collection, managing and processing, publishing, delivery, and testing for acceptance).\n- The role of data procurement in ensuring data quality.\n\nThe document serves as a general resource for developing more focused recommendations and metrics for specific regulatory domains, aiming to enhance the quality of data used in medicines regulation and ultimately improve regulatory decision-making processes.", "excerpt_keywords": "Keywords: data quality framework, EU medicines regulation, big data, standardisation, digitalisation"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\n# abbreviations\n\n|cdm|common data model|\n|---|---|\n|chmp|committee for medicinal products for human use|\n|dq|data quality|\n|dqf|data quality framework|\n|ehr|electronic health record|\n|ehds|european health data space|\n|ema|european medicines agency|\n|etl|extract, transform and load|\n|fair|findable, accessible, interoperable and reusable|\n|gxp|good x practices, where x stands for laboratory (glp), clinical (gcp), 
manufacturing (gmp), distribution or documentation (gdp)|\n|icsr|individual case safety reports|\n|iso|international organisation for standardisation|\n|mdm|master data management|\n|qms|quality management system|\n|qsr|quality system regulation|\n|rwd|real-world data|\n|rwe|real-world evidence|\n\n# background - the need for a data quality framework for medicines regulation\n\nas acknowledged in the recommendations of the hma-ema joint big data task force and the workplan\nof the hma-ema joint big data steering group, establishing an eu framework for data quality (dq)\nand representativeness is a critical element for realising the full potential of (big) data and driving\nregulatory decisions.\n\nin recent years, the eu regulatory assessment process has been exploring a shift from document-based submissions to direct assessments of the data underlying those submissions. to facilitate this\npotential shift, there is an increased need for standardisation [1], and the need for a framework, which\nwould characterise dq and would allow the regulator to make reliable assessments of whether the data\nare fit for the purpose of decision making.\n\nin addition, the progress in digitalisation and information technology creates new opportunities, but\nalso contributes to an increasingly complex landscape for regulatory decision making. 
while new types\nof data become available, guidelines or methods to demonstrate whether such data are adequate for\n\ndata quality framework for eu medicines regulation ema/326985/2023 page 5/42", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "cda93fbd-33f4-41f4-bf3e-9af0d813a1db": {"__data__": {"id_": "cda93fbd-33f4-41f4-bf3e-9af0d813a1db", "embedding": null, "metadata": {"page_label": "6", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Integrating Real-World Data into EU Medicines Regulation: A Framework for Enhancing Data Quality and Decision-Making\"", "questions_this_excerpt_can_answer": "1. What are the primary objectives of the Data Quality Framework (DQF) as outlined in the document \"Integrating Real-World Data into EU Medicines Regulation: A Framework for Enhancing Data Quality and Decision-Making\"?\n\n2. How does the document propose to integrate and assess real-world data (RWD) alongside traditional clinical trial data to support regulatory decision-making for medicines within the EU?\n\n3. 
What specific types of data does the Data Quality Framework aim to encompass, and how does it plan to address the varying methods, terminologies, metrics, and issues across these data types to ensure consistency in data quality related processes for regulatory decision-making?", "prev_section_summary": "This section introduces the concept of a Data Quality Framework (DQF) for EU medicines regulation, highlighting its significance as recognized by the HMA-EMA Joint Big Data Task Force and the HMA-EMA Joint Big Data Steering Group. The establishment of an EU framework for data quality (DQ) and representativeness is emphasized as crucial for leveraging the full potential of big data in regulatory decisions. The document discusses the ongoing shift in the EU regulatory assessment process from document-based submissions to direct assessments of underlying data, underlining the increased need for standardization to support this transition. It also addresses the challenges and opportunities presented by advancements in digitalization and information technology in the regulatory decision-making landscape. Key entities and concepts mentioned include common data models (CDM), electronic health records (EHR), the European Health Data Space (EHDS), the European Medicines Agency (EMA), and real-world data (RWD) and evidence (RWE), among others. The section sets the stage for a detailed discussion on the importance of a data quality framework in enhancing the EU medicines regulatory framework through improved data standardization, reliability, and decision-making capabilities.", "excerpt_keywords": "data quality framework, real-world data, regulatory decision-making, EU medicines regulation, horizontal system"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\ndecision making are still scarce. 
therefore, a data quality framework (dqf) is needed to guide coherent and consistent quality assessment procedures.\n\none notable example is healthcare data that are becoming available in increasing quantity to potentially support regulatory decision making for medicines. information derived from routinely collected real-world data (rwd) has for a long time been used to support regulatory decision making on the safety of drugs in the post-authorisation phase. while traditional pre-approval randomised controlled clinical trials remain the fundamental method of establishing the safety and efficacy of medicines during the pre-authorisation phase, they could potentially benefit from the evidence generated using these data. insights into the real world are also required by downstream stakeholders including health technology assessment bodies, payers and ultimately clinicians and patients. to bridge these gaps, the regulatory network needs to acquire the ability to describe and quantify the degree to which these data are accurate and fit for purpose.\n\n# scope of this dqf\n\nthis document aims to provide a set of definitions, principles and guidelines that can coherently be applied to any data source for the purpose of characterising, assessing, and assuring dq for regulatory decision making. this framework is intended to encompass primary and secondary use, as well as metadata and supporting information (e.g., mdm (master data management), underlying reference data) applicable to support committee for medicinal products for human use (chmp) decision making. 
the document is targeted primarily at the eu medicines regulatory network, but the relevance of the content can be of interest to a wider range of stakeholders such as marketing authorisation holders, data source holders, researchers, and patient associations.\n\nas methods, terminologies, metrics, and issues vary across data types and sources, this framework seeks to provide a coherent basis to identify, define, and further develop dq assessment procedures and recommendations for current and novel data types.\n\nobjectives of this framework are therefore to achieve consistency in dq related processes, foster the development of horizontal systems for dq and eventually enable a more adequate and automated use of data for regulatory decision making.\n\nthis framework builds on the recommendations of tehdas [2] and extends them with a classification of quality dimensions and assessment criteria, as well as guidelines for their application. it builds on the definitions and recommendations that have been proposed in several existing dq frameworks, including [2-13].\n\nwhile many examples provided in this framework relate to real-world data, the scope of this framework extends to a broad range of regulatory activities and their respective data types, including real-world data [14, 15] (including within clinical trials to supplement trial-specific data collection), bioanalytical omics data, animal health data, preclinical data (cell and animal-based laboratory data), spontaneous adverse event reporting data, chemical and manufacturing control data, and more.\n\na \"horizontal system\" provides a specific set of functionalities across a variety of use cases. in this case the intention is to develop systems and approaches to dq that can be used (and potentially shared) across use cases. 
\"horizontal system\" is defined by contrast to \"vertical system\", where all dq processes and systems would be developed ad hoc and targeted to a specific use case.\n\ndata quality framework for eu medicines regulation ema/326985/2023 page 6/42", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "18f4079e-64c7-45df-ab84-4f189a2a47ae": {"__data__": {"id_": "18f4079e-64c7-45df-ab84-4f189a2a47ae", "embedding": null, "metadata": {"page_label": "7", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Developing a Comprehensive Data Quality Framework for Enhancing Regulatory Decision-Making in Health Research and Policy: Definitions, Scope, and Limitations\"", "questions_this_excerpt_can_answer": "1. How does the Data Quality Framework (DQF) for EU medicines regulation define the concept of data quality in the context of health research, policy making, and regulation?\n \n2. What are the specific limitations of scope outlined by the DQF in relation to regulatory decision-making within the EU medicines regulation context?\n\n3. How does the DQF differentiate between the quality of data and the quality of the underlying elements the data refer to, particularly in the context of medicinal product purity?", "prev_section_summary": "The section discusses the need for a Data Quality Framework (DQF) to guide coherent and consistent quality assessment procedures for regulatory decision-making in the EU medicines regulation context. 
It highlights the increasing availability of healthcare data, particularly real-world data (RWD), which has traditionally supported regulatory decisions on drug safety in the post-authorization phase. The document emphasizes the potential benefits of integrating RWD with traditional clinical trial data to enhance decision-making processes, not only for regulatory bodies but also for health technology assessment bodies, payers, clinicians, and patients.\n\nThe scope of the DQF is to provide definitions, principles, and guidelines applicable to various data sources to ensure data quality (DQ) for regulatory decision-making. It aims to cover primary and secondary use data, metadata, and supporting information relevant to the Committee for Medicinal Products for Human Use (CHMP) decisions. The framework is designed for the EU medicine regulatory network but is also relevant to other stakeholders like marketing authorization holders, data source holders, researchers, and patient associations.\n\nThe document seeks to address the challenges posed by varying methods, terminologies, metrics, and issues across different data types and sources by offering a coherent basis for identifying, defining, and developing DQ assessment procedures. The objectives include achieving consistency in DQ-related processes, fostering the development of horizontal systems for DQ, and enabling more adequate and automated data use for regulatory decision-making.\n\nThe framework builds on previous recommendations and extends them with a classification of quality dimensions, assessment criteria, and application guidelines. It encompasses a wide range of regulatory activities and data types, including real-world data, bioanalytical omics data, animal health data, preclinical data, spontaneous adverse event reporting data, and chemical and manufacturing control data. 
The concept of a \"horizontal system\" is introduced to describe a set of functionalities that can be applied across various use cases, contrasting with \"vertical systems\" developed for specific use cases.", "excerpt_keywords": "Data Quality Framework, EU medicines regulation, regulatory decision-making, health research, data transparency"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\n# 4.1. definition of data\n\nin this dqf, data are considered as any information asset that represents measurements or observations and that can be used to support decision making, directly or indirectly through analysis.\n\n# 4.2. definition of dq\n\nin general terms, quality is defined as an attribute of a product or service that defines the degree to which it meets customer and other stakeholder needs within statutory and regulatory requirements, or its fitness for intended use [2]. the same principle applies to data, and for the purpose of this document the following definition is adopted:\n\ndata quality is defined as: \"fitness for purpose for users' needs in relation to health research, policy making, and regulation and that the data reflect the reality, which they aim to represent\" [2].\n\ntherefore, this dqf restricts its scope to determinants of dq that are relevant for regulatory decision making.\n\n# 4.3. limitations of scope\n\nfollowing the definition of dq and the restricted focus on regulatory decision making, this framework's scope excludes:\n\n- analytical methods to derive evidence, i.e., conclusions and insights, from underlying data. this framework focuses on defining guidelines for assessing the quality of the data used for regulatory decisions, not on their actual usage for regulatory decision making and the methods involved. 
while data quality and methods for evidence generation are effectively a continuum in terms of decision making, when taking the perspective of data collection, dissemination, and re-use, they are distinct.\n- aspects of dq that do not directly impact regulatory decision making, e.g., conciseness or accessibility. for instance, conciseness is a relevant dimension of data quality in that it affects fitness for purpose when transmitting or archiving large datasets (e.g., for genomics data). however, it is not relevant in terms of data being fit for purpose to answer a specific (regulatory) question. accessibility can also be an important aspect of dq, but in the context of a regulatory activity, data is by definition accessible to interested parties.\n- data transparency, intended as the characteristic of data being used lawfully, traceably and for valid purposes, is also excluded from this framework. issues related to data transparency go beyond data quality assessment in support of decision making. rather, this framework provides guidelines that can be part of a broader set of recommendations to support data transparency. as for accessibility, it should be noted that there may be an indirect impact of transparency on data quality, and this will be addressed, when relevant, in extensions of this framework.\n- quality of the underlying elements the data refer to, e.g., when considering a dataset about the purity of a medicine, this framework will cover the reliability, completeness, and other aspects of the data, but not the purity of the medicine itself.\n\nnote that reality is in general not fully observable. in this definition we consider how data reflect the aspects of reality that they are designed to capture (e.g., disease frequency in a population). 
context is important to understand how what is observed relates to the whole.\n\nin some cases, aspects of dq that do not directly relate to decision making may have an indirect impact on it, e.g., a data source that is broadly accessible will likely be less opaque, more validated, and potentially of higher quality. such aspects might be considered and possibly quantified in future extensions of this framework.\n\nas an example, the eu gdpr regulation defines transparency as: \"the principle of transparency requires that any information addressed to the public or to the data subject be concise, easily accessible and easy to understand\". this poses a broader set of requirements than what is strictly related to the use of data for regulatory decision making.\n\ndata quality framework for eu medicines regulation ema/326985/2023 page 7/42", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "e37d3e5f-7696-4cc0-9456-0d19c2be245c": {"__data__": {"id_": "e37d3e5f-7696-4cc0-9456-0d19c2be245c", "embedding": null, "metadata": {"page_label": "8", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Assessing and Enhancing Data Quality for Regulatory Decision-Making in EU Medicines Regulation: A Comprehensive Framework\"", "questions_this_excerpt_can_answer": "1. 
What are the key determinants of data quality (dq) when generating evidence for regulatory purposes in the context of EU medicines regulation, as outlined in the \"Assessing and Enhancing Data Quality for Regulatory Decision-Making in EU Medicines Regulation: A Comprehensive Framework\"?\n\n2. How does the document define the relationship between \"use-case\" and \"regulatory question\" within the framework of assessing and enhancing data quality for regulatory decision-making in EU medicines regulation?\n\n3. According to the \"Assessing and Enhancing Data Quality for Regulatory Decision-Making in EU Medicines Regulation: A Comprehensive Framework,\" how should the diversity of data sources and their respective generation processes be considered when evaluating data quality for regulatory decision-making in the EU?", "prev_section_summary": "This section from the document outlines the Data Quality Framework (DQF) for EU medicines regulation, focusing on the definition of data quality (DQ), its scope, and limitations within the context of health research, policy making, and regulatory decision-making. Key topics and entities include:\n\n1. **Definition of Data**: Data is described as any information asset that represents measurements or observations useful for supporting decision-making, either directly or through analysis.\n\n2. **Definition of Data Quality (DQ)**: DQ is defined in terms of \"fitness for purpose for users' needs\" in relation to health research, policy making, and regulation. It emphasizes that data should accurately reflect the reality they aim to represent, focusing on the relevance of data quality for regulatory decision-making.\n\n3. 
**Limitations of Scope**: The DQF explicitly excludes certain aspects from its scope, such as:\n - Analytical methods for deriving evidence from data.\n - Aspects of DQ not directly impacting regulatory decision-making, like conciseness or accessibility.\n - Data transparency, which, while important, is considered beyond the immediate scope of data quality assessment for decision-making.\n\n4. **Quality of Underlying Elements**: The framework distinguishes between the quality of data and the quality of the underlying elements the data refer to, such as the purity of a medicine. It acknowledges that reality is not fully observable and that context is crucial for understanding how data reflects aspects of reality.\n\n5. **Indirect Impact on Decision Making**: It is noted that some aspects excluded from the direct scope, like accessibility and transparency, might indirectly impact decision-making quality. These aspects may be considered in future extensions of the framework.\n\n6. **EU GDPR Regulation on Transparency**: The document references the EU GDPR's definition of transparency to illustrate broader requirements that extend beyond the use of data for regulatory decision-making.\n\nThis section establishes a foundational understanding of how the DQF approaches data quality within the EU medicines regulation context, setting the stage for further guidelines and recommendations to enhance regulatory decision-making through high-quality data.", "excerpt_keywords": "data quality, regulatory decision-making, EU medicines regulation, evidence generation, use-case"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\n# general considerations underlying the maintenance and assessment of dq\n\n# dq determinants for evidence generation\n\nthe landscape of data that can be potentially used for regulatory purposes extends to diverse data sources, each generated through 
different processes and fit for different primary and secondary uses. when considering the overall quality of a dataset at the point of regulatory decision making, it is important to distinguish what contributes to quality, and what can be measured or controlled at what stage.\n\nin the context of this framework, \"use-case\" is used as a broader synonym of \"regulatory question\", when referring to a set of related questions and related activities.\n\ndata quality framework for eu medicines regulation ema/326985/2023 page 8/42", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "9f191592-ba27-463d-a7d5-630d7a58cbaf": {"__data__": {"id_": "9f191592-ba27-463d-a7d5-630d7a58cbaf", "embedding": null, "metadata": {"page_label": "9", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Assessing the Determinants of Data Quality in EU Medicines Regulation: Foundational, Intrinsic, and Question Specific Perspectives\"", "questions_this_excerpt_can_answer": "1. How does the EU medicines regulation framework categorize the different elements that impact the quality of data used in regulatory decision-making?\n\n2. What are examples of foundational determinants of data quality in the context of EU medicines regulation, and how do they influence the trustworthiness of data for regulatory decisions?\n\n3. 
In the EU medicines regulation data quality framework, how are question specific determinants defined, and what role do they play in assessing the fitness of a dataset for answering specific regulatory or scientific questions?", "prev_section_summary": "This section from the document titled \"Assessing and Enhancing Data Quality for Regulatory Decision-Making in EU Medicines Regulation: A Comprehensive Framework\" discusses the critical aspects of maintaining and assessing data quality (DQ) in the context of EU medicines regulation. It highlights the importance of understanding the diverse sources of data that can be used for regulatory purposes, emphasizing that these data sources are generated through various processes and are suited for different uses. The document outlines the key determinants of data quality when generating evidence for regulatory decision-making, stressing the need to distinguish factors that contribute to data quality and those that can be measured or controlled.\n\nFurthermore, the document clarifies the relationship between \"use-case\" and \"regulatory question\" within the framework, using \"use-case\" as a broader term that encompasses a set of related regulatory questions and activities. This distinction is crucial for assessing and enhancing data quality in regulatory decision-making processes. The excerpt also suggests that evaluating data quality for regulatory decisions in the EU requires careful consideration of the diversity of data sources and their generation processes. This approach ensures that data used in regulatory decision-making is of high quality and fit for purpose. 
The reference to \"EMA/326985/2023\" indicates the document's identification within the European Medicines Agency's publications.", "excerpt_keywords": "data quality, EU medicines regulation, foundational determinants, intrinsic determinants, question specific determinants"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\n# data quality determinants\n\nin this framework, such elements related to dq are referred to as \"determinants\" and classified into three categories:\n\n## foundational determinants\n\nfoundational determinants pertain to the processes and systems through which data are generated, collected, processed, and made available. foundational determinants are what affects the quality of data, but are not part of the data themselves. as such, they do not depend on, and cannot be completely derived from, the content of a dataset. for data to be trusted for regulatory decision making, the underlying infrastructure and processes that collect, host, transform and move the data must be designed in such a way that the correspondence between the data and the real entity they represent is not altered. examples of foundational determinants are the use of certified software systems to collect and process data, the presence of processes, training, and audits to ensure data are properly recorded and documented, and the validation and verifiability of data processing steps.\n\n## intrinsic determinants of data\n\nintrinsic determinants of data pertain to aspects that are inherent to a given dataset. intrinsic determinants are what can be derived given a dataset and possibly some external generic knowledge, but without the context in which the data were generated, as well as the context the data will be used in (e.g., a scientific or regulatory question). 
examples of intrinsic determinants are coherent or incoherent formatting, the presence of errors (e.g., truncation) or the plausibility of the data.\n\n## question specific determinants\n\nquestion specific determinants pertain to aspects of dq that cannot generally be defined independently of a specific question or approach to analysis. examples of question specific determinants are the acceptability of the completeness of a dataset, or its level of approximation (e.g., dates expressed in days or months) to answer a specific question.\n\nin general, foundational determinants have a direct impact on dq. when they cannot be controlled, the only option is to control the intrinsic aspects of dq. the scope of such control is limited in its ability to assure fitness for purpose when a question (or set of typical questions) is not defined.\n\nfigure 2 - determinants of data quality\n\ndata quality framework for eu medicines regulation ema/326985/2023 page 9/42", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "95cca840-7166-42fe-99be-2306b0c23ac9": {"__data__": {"id_": "95cca840-7166-42fe-99be-2306b0c23ac9", "embedding": null, "metadata": {"page_label": "10", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Developing a Comprehensive Data Quality Framework for Evidence Generation Throughout the Lifecycle of EU Medicines Regulation: Processes and Key Determinants\"", "questions_this_excerpt_can_answer": "1. 
What are the specific stages involved in the data quality (dq) framework for evidence generation in the context of EU medicines regulation, and how do they contribute to ensuring the reliability and suitability of data for its intended use?\n\n2. How does the document describe the role of iterative feedback loops in the data quality checks process within the evidence generation lifecycle for EU medicines regulation, and what implications does this have for the management and improvement of data quality?\n\n3. In the context of EU medicines regulation, how is the fit-for-purpose assessment of data for secondary use conducted, and what determinants are considered critical for assessing data quality at various stages of the evidence generation process?", "prev_section_summary": "This section from the document titled \"Assessing the Determinants of Data Quality in EU Medicines Regulation: Foundational, Intrinsic, and Question Specific Perspectives\" outlines the framework for understanding the different elements that impact the quality of data used in regulatory decision-making within the context of EU medicines regulation. The framework categorizes these elements into three main types of determinants:\n\n1. **Foundational Determinants**: These are related to the processes and systems involved in generating, collecting, processing, and making data available. They are external to the data themselves and are crucial for ensuring the trustworthiness of data for regulatory decisions. Examples include the use of certified software systems, the presence of processes for proper data recording and documentation, and the validation and verifiability of data processing steps.\n\n2. **Intrinsic Determinants of Data**: These pertain to characteristics inherent to the dataset itself, such as formatting coherence, presence of errors, or data plausibility. 
These determinants can be assessed given the dataset and some external generic knowledge but do not depend on the specific context of data generation or intended use.\n\n3. **Question Specific Determinants**: These are aspects of data quality that are relevant to specific questions or analytical approaches. They include considerations like the acceptability of dataset completeness or the appropriateness of data granularity (e.g., dates vs. months) for answering a particular question.\n\nThe document emphasizes that foundational determinants directly impact data quality (DQ) and, when uncontrollable, necessitate a focus on controlling intrinsic aspects of DQ. However, this control is limited in ensuring data's fitness for purpose without a defined question or set of questions. This framework is part of the EU medicines regulation data quality framework, referenced as EMA/326985/2023, and is crucial for understanding how data quality is assessed and ensured in the context of EU medicines regulation.", "excerpt_keywords": "data quality, evidence generation, EU medicines regulation, fit for purpose assessment, iterative feedback loops"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\n# 5.2. dq along the evidence generation process\n\ndata that are suitable and available for evidence generation go through a process (part of a broader \"life cycle\" 6) that is specific to the type of data, the processes and organisations that produce it. for data that is already collected (secondary use), a fit for purpose assessment for the intended use should be done prior to this process. dq checks occur at various steps of this process and may include iterative feedback loops.\n\nas a reference, a general high-level lifecycle is outlined as follows (see figure 3):\n\n- definition of data requirements: what data are sought, and what their characteristics should be. 
for primary data, this phase can include elements directly related to evidence generation.\n- data collection or generation: gaining data reflecting the observed reality.\n- data management and processing: including data transfers, normalisation, and cleansing.\n- data publishing: making data available to consumers.\n- data procurement and aggregation: sourcing data from one or more sources.\n- testing and acceptance: assessing the suitability of the procured data for intended needs.\n- delivery for consumption: using data to support a specific activity, e.g., analysis.\n\nnot all phases presented here are present in all data workflows (e.g., sensor or social data may be collected on a \"what is available\" basis, rather than based on specific requirements), extra phases may apply, and the order may differ.\n\nfor the scope of the assessment and management of dq, it is important to establish what determinants apply at which stage of this process, and what may be the impact. for instance, intrinsic aspects of dq can be measured, and such measures could be used to improve reliability at the stage of data collection and generation, or to provide an assessment of quality at publication time. integration with additional data would require re-assessment. 
question-specific determinants of dq need to be assessed each time data are repurposed to answer a question they were not originally collected or designed for.\n\ndq checks occur at various steps along the evidence generation process and may include iterative feedback loops as indicated by the dashed line in figure 3.\n\n6 the data life cycle is broader in that it would extend to aspects of data disposal and maintenance beyond usage.\n\ndata quality framework for eu medicines regulation ema/326985/2023 page 10/42", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "1eefa34c-688d-48ab-ba52-d656c2872f6a": {"__data__": {"id_": "1eefa34c-688d-48ab-ba52-d656c2872f6a", "embedding": null, "metadata": {"page_label": "11", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Frameworks and Processes for Ensuring Data Quality and Standardization in EU Medicines Regulation: Distinguishing Between Primary and Secondary Data Use\"", "questions_this_excerpt_can_answer": "1. How does the EU medicines regulation framework distinguish between data quality (DQ) and data standardization in the context of regulatory decision-making?\n \n2. What are the specific processes and systems recommended by the EU medicines regulation framework to assure data quality, especially in light of adopting standards and managing data according to the FAIR principles?\n\n3. 
What is the distinction between primary and secondary use of data as defined in the EU medicines regulation data quality framework, and how does this distinction impact the application of guidelines and metrics in data management and processing?", "prev_section_summary": "The section outlines a comprehensive data quality (dq) framework for evidence generation within the context of EU medicines regulation, emphasizing the lifecycle approach to managing and ensuring data quality. Key stages of this lifecycle include the definition of data requirements, data collection or generation, data management and processing, data publishing, data procurement and aggregation, testing and acceptance, and delivery for consumption. It highlights that not all stages are applicable to all data workflows and that the sequence of these stages may vary.\n\nThe document stresses the importance of conducting a fit-for-purpose assessment for data intended for secondary use before proceeding with the evidence generation process. This involves evaluating whether the data is suitable for its intended use, which is a critical step in ensuring data quality.\n\nData quality checks are described as occurring at various steps throughout the evidence generation process and may involve iterative feedback loops to continuously improve data quality. 
The framework also acknowledges the need for re-assessment of data quality when data is integrated with additional data or repurposed for new questions it was not originally collected or designed to answer.\n\nFurthermore, the document mentions the broader data lifecycle, which extends to include data disposal and maintenance beyond its initial usage, indicating a holistic approach to data management within the EU medicines regulation context.\n\nEntities mentioned include:\n- EU medicines regulation\n- Evidence generation process\n- Data quality (dq) checks\n- Iterative feedback loops\n- Fit-for-purpose assessment\n- Data lifecycle stages (definition of data requirements, data collection/generation, data management and processing, data publishing, data procurement and aggregation, testing and acceptance, delivery for consumption)", "excerpt_keywords": "data quality, EU medicines regulation, data standardization, FAIR principles, primary and secondary data use"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\n# definition of data requirement\n\n|data|data collection and/or generation|data managing and processing|data publishing|\n|---|---|---|---|\n|delivery for consumption|testing and acceptance|data procurement| |\n\n# data life cycle\n\nfigure 3 - a typical data processing workflow in the evidence generation process\n\n# 5.2.1. dq vs standardisation\n\nfrom the point of view of regulatory decision making, dq is distinct from data standardisation: data that are not fit for purpose in terms of answering a regulatory question will not become fit when standardised, and non-standardised data can be still used to answer a regulatory question. 
dq also applies to individual and non-standard data sources.\n\nthe implementation of systems and processes to assure dq is largely affected (and in some cases fully determined) by the adoption of standards as well as by data management recommendations (e.g., fair data [findable, accessible, interoperable and reusable]) [16], and the availability of resources such as ontologies and mdm systems that underpin semantic interoperability.\n\ntherefore, recommendations on specific standards and standardisation processes are not included in this framework, while adoption of standards does drive implementation maturity levels.\n\n# 5.2.2. primary vs secondary use of data\n\nprimary data collection is a process of collecting original data (newly collected), directly from the source. it can be gathered from observations, interviews, biometrics (blood pressure, weight, blood tests, etc.) or surveys (questionnaires). primary use of data is the use of information for the specific purposes for which it was collected, while secondary use of data involves using the data that have initially been gathered for other purposes. see the glossary for an explanation of primary and secondary data.\n\nin the application of guidelines and metrics, an important distinction arises between primary and secondary use of data. when systems are designed to collect and process data for a specified primary\n\nit should be noted that data standardization processes may alter the original information and its semantics. 
as noted later in this document, \"standardization at source\" is preferred to \"a-posteriori\" standardization.\n\n# data quality framework for eu medicines regulation\n\nema/326985/2023\n\npage 11/42", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "80083296-5afe-41f0-ada1-5e0f2b65caf5": {"__data__": {"id_": "80083296-5afe-41f0-ada1-5e0f2b65caf5", "embedding": null, "metadata": {"page_label": "12", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Improving Data-Driven Insights: Strategies for Data Quality Control, Metadata Application, and Contextual Analysis in Data Publication and Consumption\"", "questions_this_excerpt_can_answer": "1. How does the data quality (DQ) control process differ between the stages of data publication and data consumption within the EU medicines regulation framework, and what specific strategies are recommended for ensuring data quality during these stages?\n\n2. In the context of EU medicines regulation, how are metadata utilized to enhance data quality control, and what are the specific requirements for metadata in regulatory decision-making, including examples of metadata catalogues that support this process?\n\n3. 
What are the intrinsic and question-specific aspects of data quality considered during the collection and generation of data for evidence generation within the EU medicines regulatory framework, and how do these considerations vary when data are intended for secondary use?", "prev_section_summary": "This section from the document titled \"Frameworks and Processes for Ensuring Data Quality and Standardization in EU Medicines Regulation: Distinguishing Between Primary and Secondary Data Use\" discusses several key topics and entities related to data quality (DQ) and data standardization within the context of EU medicines regulation. The main points include:\n\n1. **Definition of Data Requirement**: It outlines the stages of data lifecycle from collection/generation, managing/processing, to publishing, with a specific focus on the delivery for consumption stage.\n\n2. **Data Life Cycle**: A typical data processing workflow in the evidence generation process is depicted, emphasizing the continuous nature of data handling from collection to publication.\n\n3. **DQ vs Standardization**: The document makes a clear distinction between data quality and data standardization, stating that data must be fit for purpose (DQ) regardless of whether it is standardized. However, the adoption of standards and management practices, such as those recommended by the FAIR principles (Findable, Accessible, Interoperable, and Reusable), are crucial for assuring data quality. It also mentions the role of resources like ontologies and MDM (Master Data Management) systems in achieving semantic interoperability.\n\n4. **Primary vs Secondary Use of Data**: This part defines primary data collection as the gathering of new data directly from the source for specific purposes. In contrast, secondary use of data refers to the utilization of data collected for other purposes. 
The document highlights the importance of distinguishing between these two uses in the application of guidelines and metrics, noting that standardization processes can alter the original data's information and semantics. A preference for \"standardization at source\" over \"a-posteriori\" standardization is expressed.\n\n5. **Data Quality Framework for EU Medicines Regulation**: The document is identified with a reference number (EMA/326985/2023) and suggests that the framework is comprehensive, covering aspects from data collection to its standardization and the implications of primary versus secondary data use in regulatory decision-making.\n\nOverall, the section emphasizes the importance of data quality and standardization in the regulatory decision-making process for EU medicines, advocating for the adoption of standards and FAIR principles to ensure data is effectively managed and utilized.", "excerpt_keywords": "data quality, EU medicines regulation, metadata, evidence generation, secondary data use"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\n# 5. data quality considerations\n\n# 5.2. data quality in evidence generation\n\nwhen data are collected and generated for a specific purpose, or when there are established requirements for secondary use, intrinsic and question-specific aspects of data quality (dq) can be considered during collection and generation. systems and processes can be designed to ensure the required quality level for evidence generation. analysis decisions and process specifications can be made at this stage, with downstream analysis focusing on synthesizing results and assessing uncertainty levels. this is different for secondary data use, where quality criteria may not align with the original data collection purposes. in such cases, dq control is often based on intrinsic determinants.\n\n# 5.2.3. 
publication vs data consumption\n\nduring the data life cycle, data go through two main contexts: publication and consumption. in the publication context, data are generated, processed, and made available, such as aggregating data from multiple sources for general usage or generating data from wearables. in the consumption context, data are aggregated to support analysis, like integrating sources for a specific study.\n\nthese contexts may overlap, especially when data are collected for a specific primary use, or they may be distinct, such as data collected and published for various potential uses, typically for secondary analysis. quality assessment approaches may differ between publication and consumption contexts, with separate specifications for acceptable quality levels.\n\n# 5.3. data and metadata\n\nmetadata, known as \"data about data,\" offer context about data purpose and generation, including source characterization, processing steps, lineage, and data element definitions. the distinction between data and metadata is not always clear, as information considered metadata in one context may be seen as data in another. for regulatory decision-making, metadata should be treated similarly to data, especially if changes could impact conclusions.\n\nin a dq context, metadata should extend beyond metrics and summary descriptions to include source, process, and data element characterizations. metadata are often published in data catalogues to enable data discovery and fitness assessment without exposing the actual data.\n\nvarious metadata catalogues like darwin and encepp are being developed for this purpose. 
an example from statistics finland can be accessed at: https://www.aineistokatalogi.fi/catalog.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "0cd47f07-f7cb-4abf-b49d-469ee670c849": {"__data__": {"id_": "0cd47f07-f7cb-4abf-b49d-469ee670c849", "embedding": null, "metadata": {"page_label": "13", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Assessing Data Quality and Immutability in the Context of EU Medicines Regulation: Distinguishing Between Data and Information\"", "questions_this_excerpt_can_answer": "1. How does the EU medicines regulation framework define the difference between data immutability and the process of updating evidence for regulatory purposes?\n \n2. In the context of EU medicines regulation, how is the distinction between data and information conceptualized, especially in terms of their roles in evidence generation for decision-making?\n\n3. What are the specific definitions of \"validation\" and \"verification\" within the EU medicines regulation data quality framework, and how do they differ from each other and from other common uses of \"validation\"?", "prev_section_summary": "The section discusses the importance of data quality (DQ) considerations in the context of EU medicines regulation, focusing on evidence generation, the distinction between data publication and consumption, and the role of metadata in enhancing data quality control.\n\n1. 
**Data Quality in Evidence Generation**: It highlights the importance of considering both intrinsic and question-specific aspects of DQ when collecting and generating data for a specific purpose or for secondary use. It emphasizes designing systems and processes to ensure the required quality level for evidence generation, with analysis decisions and process specifications made early on.\n\n2. **Publication vs. Data Consumption**: The text differentiates between the publication and consumption contexts within the data life cycle. In the publication context, data are generated, processed, and made available for general use or specific applications like wearables. In the consumption context, data are aggregated for analysis, such as integrating sources for a specific study. It notes that quality assessment approaches may vary between these contexts, with different specifications for acceptable quality levels.\n\n3. **Data and Metadata**: The section outlines the significance of metadata, or \"data about data,\" in providing context about data purpose and generation, including source characterization and processing steps. It stresses that metadata should be treated with the same importance as data in regulatory decision-making, especially if changes could impact conclusions. Metadata should include detailed characterizations to support data discovery and fitness assessment. 
Examples of metadata catalogues, such as DARWIN and ENCEPP, are mentioned, with a specific reference to Statistics Finland's catalogue for further illustration.\n\nOverall, the section underscores the critical role of data quality control and metadata in ensuring the integrity and utility of data within the EU medicines regulatory framework, particularly in the contexts of data publication and consumption for evidence generation.", "excerpt_keywords": "data immutability, evidence generation, validation, verification, metadata"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\n# 5.4. data immutability\n\ndata about some measured or observed aspect of reality may change in time, both reflecting the actual change in the observed entities, and the change of the available information at some given time. for instance, the weight of an individual may change, both due to a change of the individual itself, or because of a more accurate reading superseding a previous measurement.\n\nit is important to distinguish the current data availability (or the current knowledge about reality) from the data as it was available at a given time (a specific record of knowledge about reality). the latter, data reflecting what was known at a given time, is \"immutable\" in that, by definition, it cannot change.\n\nfor any regulatory purpose, evidence should be based on data intended as the record of knowledge at a given time. in other words, data used to support regulation should be immutable. this doesn't imply that evidence can never be updated: any update should be considered as a distinct (albeit related) dataset. this is a foundational concept implied by most frameworks, e.g., alcoa and fair [16].\n\nthe consequence of this principle is that data used for decision making should be versioned and unaltered within any given version.\n\n# 5.5. 
data vs information\n\nin its strictest definition, data represent facts or observations that are unprocessed (e.g., as generated by an instrument) while information represents insights originating from such data, once they are understood and processed in their context (e.g., a patient's response to a treatment as opposed to a set of individual readouts).\n\nthis framework focuses on evidence generation that can be provided for decision making and as such it goes beyond a distinction between data and information.\n\n# 5.6. frame of reference (validation vs verification)\n\nsome aspects of dq can be measured with respect to different references, contained within the same dataset, or existing beyond the scope of the dataset either as a generic reference or external gold standard, or as the actual fact in the real world. for instance, the weight of an individual could be verified for quality based on its capture in the data (e.g., as a missing value), based on knowledge of a natural weight range or verified against a recorded weight in a source document.\n\nin some frameworks, the assessment of quality within a dataset is referred to as \"verification\" while the assessment with respect to a source record or external gold standard is referred to as \"validation\". we follow these definitions.\n\nnote: this notion of validation should not be confused with validation as a form of coherence checking, see section 6.3.1.\n\n# 5.7. granularity of data and dq\n\nfor structured data, dq can typically be assessed at different levels of granularity:\n\n- the value level corresponds to a specific data point (e.g., a weight). this is also referred to as row level when the focus is on all values relative to the same entity.\n\n9 the term entity is used to denote the subject of a set of values. in an information record, for instance about a sample, each value in the record would express the measurement of a variable that relates to the same subject or entity. 
the entity is typically identified in a record via an identifier.\n\ndata quality framework for eu medicines regulation ema/326985/2023 page 13/42", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "701e3b79-69f7-49f5-8442-dc73f629780e": {"__data__": {"id_": "701e3b79-69f7-49f5-8442-dc73f629780e", "embedding": null, "metadata": {"page_label": "14", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Comprehensive Framework for Assessing Data Quality Dimensions and Metrics in Structured Data within the Context of EU Medicines Regulation\"", "questions_this_excerpt_can_answer": "1. How does the Data Quality Framework (DQF) for EU Medicines Regulation define and differentiate between the levels of data granularity, specifically between the value, variable (column), dataset, and table levels, within the context of structured data?\n\n2. What is the approach of the DQF towards setting acceptance thresholds for data quality metrics, especially in the context of primary data collection versus secondary use of data, within the EU medicines regulatory environment?\n\n3. 
How does the DQF address the challenge of assessing data quality across multiple sources, especially considering the inconsistencies in the definitions of data quality dimensions across different frameworks, within the context of EU medicines regulation?", "prev_section_summary": "This section from the document titled \"Assessing Data Quality and Immutability in the Context of EU Medicines Regulation: Distinguishing Between Data and Information\" delves into several key topics related to data quality and its implications for EU medicines regulation. The main points include:\n\n1. **Data Immutability**: It emphasizes the importance of distinguishing between the current availability of data and the data as it was available at a specific point in time. Data used for regulatory purposes should be considered immutable, meaning it should remain unaltered once recorded to reflect knowledge at that given time. However, it acknowledges that evidence can be updated, but such updates should be treated as separate datasets.\n\n2. **Data vs. Information**: The document differentiates between data (unprocessed facts or observations) and information (insights derived from data within its context). It highlights the role of both in evidence generation for decision-making, suggesting that the framework extends beyond the simple distinction between data and information.\n\n3. **Validation vs. Verification**: It introduces the concept of validation and verification within the data quality (DQ) framework. Verification refers to the assessment of quality within a dataset, while validation involves comparing data to a source record or external standard. The document clarifies that this notion of validation is distinct from other forms of validation, such as coherence checking.\n\n4. 
**Granularity of Data and DQ**: The section outlines how data quality can be assessed at different levels of granularity, including the value level (specific data points) and the entity level (all values related to the same subject or entity). It defines an entity as the subject of a set of values, typically identified by an identifier.\n\nOverall, the section provides insights into the principles of data immutability, the distinction between data and information, and the specific definitions and applications of validation and verification within the EU medicines regulation data quality framework.", "excerpt_keywords": "Data Quality Framework, EU Medicines Regulation, Structured Data, Data Granularity, Acceptance Thresholds"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\nthe variable level (also referred to as column level) covers a data point for a whole set of individuals (e.g., weight as a variable in a clinical study demographics table). metrics for dq at the value level are often easily extended to the column level, for instance by converting values to a percentage.\n\nthe dataset level covers an overall set of related observations. in some contexts, a further distinction can be made, within a dataset, between parts of dataset that are about similar entities. when such distinction is made, such parts are referred to as table level, as those parts would normally appear in distinct tables.\n\nthe concept of granularity also applies to unstructured data, but the definition of its levels is generally specific to the data type and hence is not addressed in this general framework.\n\nthis dqf will focus on the lowest possible level, i.e., for structured data, the value level. however, some metrics may be defined only at a higher level. 
for example, the plausibility of a single record of a person with a weight of 300 kg may not trigger a metric violation, but if 80% of the records are above 300 kg, it will.\n\n# 6. dq dimensions and metrics\n\nthe definition of dq dimensions and metrics relies on the general definition of dimension, metrics, and measures:\n\n- a dimension represents one or more related aspects or features of reality (e.g., for a physical object, its extension, or its durability).\n- a metric represents a way to assess the value of a specific feature (e.g., absolute length measured in meters under some specified circumstances).\n- a measure represents a single instance of a metric (e.g., 2 meters). multiple measures can be combined to derive more general metrics (e.g., average length).\n\ndq metrics can be defined as indicators that, when applied to a data source, can derive an assessment of one or more quality dimensions. a single quality metric can be used as an indicator for more than one dimension as expressed below in the examples for coherence.\n\nfor some metrics, acceptance thresholds (e.g., maximum percentage of missing values) can be defined. in general, and for unintended secondary usages, such thresholds can be defined only depending on the question being asked. however, when data are collected for primary use, or when some well-defined secondary uses are targeted, thresholds may be defined (e.g., minimum/maximum) that apply even at the point of data collection. the quality of data is the sum of several features of data, ranging from their correspondence to reality to their representation. it is useful to categorise such features into dimensions, where a dimension is a set of features whose measures reveal independent aspects of dq. in other words, different dimensions answer distinct dq questions.\n\nseveral data frameworks propose an organisation of dq in dimensions that are similar across frameworks, but often inconsistent in the exact definitions. 
this complicates a coherent assessment of dq when multiple sources are aggregated.\n\n10 this is typically the case for binary or categorical data.\n11 in general, a dataset in support of a specific question will be composed of homogeneous data (e.g., specific measurements) or of disparate types of data, which are related in that they measure (directly or indirectly) the same entities. in this sense, different parts of a dataset are \"linked\" as they share references to the same entities.\n12 note that measures could be unitless and not necessarily contiguous.\n13 feature is here intended as a synonym of \"aspect\" or \"characteristic\".\n\ndata quality framework for eu medicines regulation ema/326985/2023 page 14/42", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "a7f4646f-2a60-4731-865b-93929dafd19b": {"__data__": {"id_": "a7f4646f-2a60-4731-865b-93929dafd19b", "embedding": null, "metadata": {"page_label": "15", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Assessing Data Quality and Reliability in EU Medicines Regulation: An Analysis of Sub-Dimensions\"", "questions_this_excerpt_can_answer": "1. How does the EU medicines regulation framework define the reliability dimension in the context of data quality, and what specific question does this dimension aim to answer regarding data's representation of observed reality?\n\n2. 
What are the sub-dimensions of reliability in the EU medicines regulation data quality framework, and how is accuracy within this context specifically defined in relation to data's reflection of reality?\n\n3. In the context of the EU medicines regulation data quality framework, how is the discrepancy between data and reality quantified or described, especially with the example of measuring a person's weight?", "prev_section_summary": "The section from the document titled \"Comprehensive Framework for Assessing Data Quality Dimensions and Metrics in Structured Data within the Context of EU Medicines Regulation\" discusses several key topics and entities related to the Data Quality Framework (DQF) for EU Medicines Regulation, focusing on structured data. The key points include:\n\n1. **Levels of Data Granularity**: The document differentiates between various levels of data granularity, such as the value level, variable (or column) level, dataset level, and table level. It emphasizes that while metrics for data quality (DQ) at the value level can often be extended to the column level, some metrics may only be defined at a higher level. The document also notes that the concept of granularity applies to unstructured data but is not addressed in this framework due to its specificity to data type.\n\n2. **Data Quality Dimensions and Metrics**: The framework outlines the general definitions of dimensions, metrics, and measures. A dimension represents one or more related aspects of reality, a metric is a way to assess the value of a specific feature, and a measure is a single instance of a metric. DQ metrics are indicators used to assess one or more quality dimensions of a data source, and a single quality metric can indicate more than one dimension.\n\n3. 
**Acceptance Thresholds for Data Quality Metrics**: The document discusses how acceptance thresholds for DQ metrics can be defined, noting that such thresholds can generally only be set depending on the question being asked. However, for primary data collection or well-defined secondary uses, thresholds may be predefined. The quality of data encompasses several features, from their correspondence to reality to their representation, categorized into dimensions that reveal independent aspects of DQ.\n\n4. **Challenges in Assessing Data Quality Across Multiple Sources**: The document acknowledges the challenge of assessing DQ when aggregating multiple sources, due to inconsistencies in the definitions of DQ dimensions across different frameworks. This complicates the coherent assessment of DQ.\n\n5. **Data Frameworks and Homogenous Data**: It mentions that several data frameworks propose organizing DQ in dimensions that are similar but often inconsistent in definitions. It also touches on the concept of datasets supporting specific questions, which may comprise homogenous data or disparate types of data related in that they measure the same entities.\n\nIn summary, the section provides an overview of how the DQF for EU Medicines Regulation approaches the definition and assessment of data quality at different levels of granularity, the setting of acceptance thresholds for DQ metrics, and the challenges of assessing DQ across multiple sources.", "excerpt_keywords": "data quality, reliability, EU medicines regulation, accuracy, discrepancy"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\n# figure 4 - dimensions of data quality\n\n|dimension|question|\n|---|---|\n|reliability|are data faithfully representing what they are meant to be?|\n|extensiveness|are data sufficient?|\n|coherence|are data analysable?|\n|relevance| |\n|timeliness| |\n\n
# 6.1. reliability\n\nreliability is defined as the dimension that covers how closely the data reflect what they are directly measuring.\n\nthe reliability dimension answers the question: to what degree do the data accurately represent an observed reality? when considering the \"fit for purpose\" definition of quality, reliability covers how correct and true the data are.\n\n# 6.1.1. reliability sub-dimensions\n\ngiven this definition, sub-dimensions can be defined:\n\n- accuracy, defined as the amount of discrepancy between data and reality. this definition of accuracy encompasses both measures of the amount of wrong information in a dataset (data systematically not reflecting reality) and the formal definition of accuracy in measurements (e.g., the distance between the measurements and the real value). for example, the weight of a person could be incorrect due to a data transcription error, or because a person is measured fully clothed, giving a systematic excess weight of 1 to 2 kg.\n\ndata quality framework for eu medicines regulation ema/326985/2023 page 15/42", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "99089c0f-ea61-41a6-9dd7-df98b3d0b03e": {"__data__": {"id_": "99089c0f-ea61-41a6-9dd7-df98b3d0b03e", "embedding": null, "metadata": {"page_label": "16", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Enhancing EU Medicines Regulation: A Comprehensive Framework for Precision, Reliability, and Data Quality through Accuracy, Plausibility, and Traceability\"", 
"questions_this_excerpt_can_answer": "1. How does the document define the concept of \"precision\" within the context of data quality for EU medicines regulation, and can you provide an example that illustrates this definition?\n\n2. What role does \"plausibility\" play in identifying accuracy issues within the data quality framework for EU medicines regulation, and how can it serve as a proxy for error detection?\n\n3. How is \"traceability\" characterized in the EU medicines regulation data quality framework, and why is it considered a critical component of data reliability?", "prev_section_summary": "This section from the document titled \"Assessing Data Quality and Reliability in EU Medicines Regulation: An Analysis of Sub-Dimensions\" focuses on the concept of reliability within the context of data quality in EU medicines regulation. It outlines how reliability is a crucial dimension of data quality, emphasizing the importance of data accurately reflecting the reality they are intended to measure. The section defines reliability as the degree to which data are accurate or correctly represent an observed reality, aligning with the \"fit for purpose\" definition of quality by highlighting the correctness and truthfulness of data.\n\nKey topics discussed include:\n- The definition of reliability in the data quality framework.\n- The question reliability aims to answer regarding data's representation of observed reality.\n- The sub-dimensions of reliability, with a specific focus on accuracy.\n\nAccuracy is specifically defined as the discrepancy between the data and reality, encompassing both the presence of wrong information in a dataset and the formal definition of accuracy in measurements. 
An example provided to illustrate this concept is the potential inaccuracies in measuring a person's weight, which could arise from data transcription errors or external factors like clothing that systematically adds excess weight.\n\nEntities mentioned include:\n- The EU medicines regulation framework.\n- The reliability dimension and its sub-dimensions, particularly accuracy.\n- The example of measuring a person's weight to explain the concept of accuracy within this framework.\n\nOverall, this section highlights the importance of reliability and accuracy in ensuring that data used in the context of EU medicines regulation faithfully represents the reality it is meant to measure, thereby ensuring the data's quality and utility for regulatory purposes.", "excerpt_keywords": "data quality, EU medicines regulation, precision, plausibility, traceability"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\n# precision\n\nprecision is defined as the degree of approximation by which data represents reality. for instance, the age of a person could be reported in years or months.\n\n# other dq concepts related to reliability\n\nstrictly related to reliability is the concept of plausibility, defined as the likelihood of some information being true. plausibility can be a proxy to detect errors: when some combination of information is unlikely (or impossible) to happen in the real world, this reveals accuracy issues. for example, the weight of a person exceeding 300 kg is possible, but the weight of many or all persons in a dataset exceeding that value is implausible (unless the foundational determinants indicate otherwise) and likely reveals errors in the measurement or the processing of the data. 
plausibility results from the comparison of a data item to typical or necessary characteristics of the entity it intends to represent and is therefore hard to measure as a pure intrinsic characteristic, as it depends on the availability of background knowledge or an external gold standard.\n\ntraceability (also referred to as data lineage or provenance) refers to the knowledge of how data came to be: what source it originated from and what processing it went through before appearing in its current form. traceability is a feature of data that falls within reliability in that it connects what is measured with the actual data.\n\n# considerations for reliability\n\nreliability fundamentally depends on the systems and processes in place for the primary collection of data and its processing and curation in further phases of the evidence generation process, for both primary and secondary use cases.\n\nin the absence of errors, accuracy would not decrease along the data aggregation process. precision may instead decrease when data are harmonised to a common data model (cdm), as this may call for a less precise representation than the original sources to fit the model.\n\nintrinsic aspects of reliability are hard to measure in a pure data-oriented framework; however, plausibility measures can provide a way to detect some classes of errors. 
reliability is independent of a specific question, though each question, in relation to data, will set a threshold for acceptable reliability.\n\n14 this definition of precision encompasses the notion of \"reproducibility of values\" under repeated measurements, in that it captures how \"coarse\" the correspondence is between the data and the characteristic they intend to measure.\n\n15 \"other concepts\" presents relevant aspects of dq that fall within a dimension or that are in common use, but that don't strictly adhere to the definition of the dimension provided.\n\n16 an example could be a cdm allowing timestamps in seconds where a source may use milliseconds, or a cdm prescribing some terminology, which is less specialised in some areas than the one used in the source.\n\ndata quality framework for eu medicines regulation ema/326985/2023 page 16/42", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "3c8beb05-94c5-48fe-ac76-1a130a4c894d": {"__data__": {"id_": "3c8beb05-94c5-48fe-ac76-1a130a4c894d", "embedding": null, "metadata": {"page_label": "17", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Assessing and Enhancing Reliability in Healthcare Data: A Comprehensive Guide to Accuracy, Verification, and Validation Metrics\"", "questions_this_excerpt_can_answer": "1. 
How does the document \"Assessing and Enhancing Reliability in Healthcare Data: A Comprehensive Guide to Accuracy, Verification, and Validation Metrics\" propose to measure the plausibility of healthcare data in terms of accuracy using atemporal metrics?\n \n2. What examples does the document provide to illustrate the concept of verification in the context of healthcare data quality, specifically regarding the similarity of different types of measurements and the consistency of patient information?\n\n3. In the framework outlined in the document for assessing data quality in EU medicines regulation, how is validation differentiated from verification and plausibility, and what specific examples are given to demonstrate validation metrics in a healthcare setting?", "prev_section_summary": "This section from the document titled \"Enhancing EU Medicines Regulation: A Comprehensive Framework for Precision, Reliability, and Data Quality through Accuracy, Plausibility, and Traceability\" discusses several key concepts related to data quality within the context of EU medicines regulation. The key topics and entities include:\n\n1. **Precision**: Defined as the degree of approximation by which data represents reality, with an example illustrating how the age of a person could be reported in years or months to demonstrate varying levels of precision.\n\n2. **Plausibility**: Described as the likelihood of some information being true, serving as a proxy for detecting errors in data. It emphasizes the importance of comparing data to typical or necessary characteristics of the entity it represents. An example provided is the implausibility of all persons in a dataset weighing over 300 kg, which would likely indicate errors in data measurement or processing.\n\n3. **Traceability**: Characterized as the knowledge of how data came to be, including its origin and the processing it underwent before reaching its current form. 
Traceability is highlighted as a critical component of data reliability, connecting measured data with actual data.\n\n4. **Reliability**: Discussed in the context of the systems and processes in place for the primary collection of data, its processing, and curation. The section notes that accuracy would not decrease in the absence of errors, but precision might decrease when data are harmonized to a common data model (CDM). It also mentions that reliability is independent of specific questions but each question sets a threshold for acceptable reliability.\n\n5. **Common Data Model (CDM)**: Mentioned as an example where precision may decrease due to the need for less precise representation than original sources to fit the model. It also touches on how a CDM might allow for timestamps in seconds where a source may use milliseconds, or prescribe terminology that is less specialized than the source.\n\nThe section underscores the importance of precision, plausibility, and traceability in ensuring the quality and reliability of data within the framework of EU medicines regulation, highlighting how these concepts interplay to detect errors, ensure accuracy, and maintain the integrity of data used in regulatory processes.", "excerpt_keywords": "data quality, healthcare data, EU medicines regulation, data validation, data verification"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\n# examples of reliability metrics\n\n|sub-dimension|metric group|abstract metric|reference|example|\n|---|---|---|---|---|\n|plausibility (proxy for accuracy)|atemporal|data values and distributions agree with internal measurements or local knowledge|validation|height and weight are a positive value.|\n|plausibility| | | |counts of unique subjects by treatment are as expected (with respect to an applicable gold standard).|\n| | | |verification|oral and axillary temperatures are similar.|
\n| | | |verification|serum glucose measurement is similar to finger stick glucose measurement.|\n| | | |verification|the patient's sex agrees with sex-specific contexts (pregnancy, prostate cancer).|\n| | | |validation|weight values are similar when taken by separate nurses within the same facility using the same equipment.|\n| | | | |hba1c values from hospital and national reference lab statistically match under the same conditions.|\n| | | | |date of birth value in the ehr is not identical to that in the registry record of the same patient.|\n\nour examples are limited to accuracy as this is the most common application of plausibility. in theory, plausibility could extend to other dimensions of reliability (e.g., a weight expressed in grams is likely to be imprecise if all values end with three zeros).\n\nofficial address domenico scarlattilaan 6 * 1083 hs amsterdam * the netherlands\n\naddress for visits and deliveries refer to www.ema.europa.eu/how-to-find-us\n\nsend us a question go to www.ema.europa.eu/contact telephone +31 (0)88 781 6000 an agency of the european union\n\n(c) european medicines agency, 2023. 
reproduction is authorised provided the source is acknowledged.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "02677a0f-27fe-4b14-9c1b-536294b7363f": {"__data__": {"id_": "02677a0f-27fe-4b14-9c1b-536294b7363f", "embedding": null, "metadata": {"page_label": "18", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "Title: Comprehensive Metrics for Assessing Data Quality in EU Medicines Regulation: Focusing on Validation, Verification, Temporal Plausibility, Calculated Values, and External Comparisons", "questions_this_excerpt_can_answer": "1. How does the EU medicines regulation framework ensure the accuracy of data when comparing identical variables across dependent databases?\n \n2. What methods are used within the EU medicines regulation data quality framework to validate the temporal plausibility of medical records, such as admission and discharge dates?\n\n3. In what ways does the EU medicines regulation data quality framework utilize external comparators, like insurance data or EMA recommendations, to validate and verify the accuracy and plausibility of medical data, including immunization sequences and outpatient procedure lengths of stay?", "prev_section_summary": "The section provides a detailed overview of a framework for assessing and enhancing the reliability of healthcare data, specifically within the context of EU medicines regulation. 
It outlines various metrics for evaluating data quality across three main sub-dimensions: plausibility, verification, and validation. \n\n- **Plausibility** is used as a proxy for accuracy, focusing on whether data values and distributions are consistent with internal measurements or local knowledge. Examples include ensuring height and weight are positive values and that counts of unique subjects by treatment align with expected standards.\n\n- **Verification** involves comparing different types of measurements or ensuring consistency within patient information. Examples given include the similarity between oral and axillary temperatures, serum glucose measurements compared to finger stick glucose measurements, and the consistency of the patient's sex with sex-specific contexts.\n\n- **Validation** is differentiated from verification and plausibility by focusing on the consistency of data when measured under similar conditions but by different methods or observers. Examples include the similarity of weight values taken by separate nurses or HbA1c values from different labs matching statistically under the same conditions.\n\nThe document emphasizes that while the examples provided focus on accuracy through plausibility, these concepts could theoretically extend to other dimensions of reliability. 
It also includes contact information for the European Medicines Agency and a copyright notice for 2023, indicating the document's source and the permission for reproduction provided the source is acknowledged.", "excerpt_keywords": "EU medicines regulation, data quality framework, temporal plausibility, external comparators, validation and verification"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\n# data quality framework for eu medicines regulation\n\n|sub-dimension|metric group|abstract metric|reference|example|\n|---|---|---|---|---|\n| | |two or more dependent databases yield similar values for identical variables (e.g., database 1 abstracted from database 2)|validation|cancer stage value in the ehr does not correspond with a naaccr code in the tumour registry of the same patient.|\n| | |calculated data values agree with common knowledge|validation|height and weight of a patient resulting in an implausible bmi value of less than 5 suggest an inaccurate height, weight, or both.|\n|temporal plausibility| |observed or derived values conform to expected temporal properties|verification|discharge date happens after admission date.|\n| | |sequences of values that represent state transitions conform to expected properties|verification|date of primary vaccine administration precedes that of the booster vaccine administration.|\n| | |observed or derived values have similar temporal properties across one or more external comparators (gold standard)|validation|length of stay for outpatient procedure conforms to insurance data for similar populations (no more than 1 day).|\n| | |sequences of values that represent state transitions are similar to external comparators (gold standards)|validation|immunisation sequences match those of the ema recommendations.|\n| | |measures of data value density against a time-oriented denominator are expected based on external 
knowledge|validation|count of immunisation per month shows an expected spike outside of flu season.|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "1752179a-dfc4-402a-afcd-720710470f36": {"__data__": {"id_": "1752179a-dfc4-402a-afcd-720710470f36", "embedding": null, "metadata": {"page_label": "19", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Comprehensive Evaluation of Data Quality: Analyzing Completeness, Coverage, and Core Concepts in Data Quality Frameworks\"", "questions_this_excerpt_can_answer": "1. How does the document differentiate between the concepts of \"completeness\" and \"coverage\" within the framework of data quality in EU medicines regulation, and what are the specific examples provided to illustrate these differences?\n\n2. What alternative methods does the document suggest for verifying a metric when a gold standard is not available, especially in the context of data quality frameworks for EU medicines regulation?\n\n3. How are the sub-dimensions of extensiveness, such as representativeness and missingness, defined in the document, and what role do they play in assessing the quality of data in the context of EU medicines regulation?", "prev_section_summary": "The section outlines a data quality framework specifically designed for EU medicines regulation, focusing on ensuring the accuracy and plausibility of medical data through various validation and verification methods. Key topics include:\n\n1. 
**Comparison Across Dependent Databases**: The framework emphasizes the importance of comparing identical variables across two or more dependent databases to ensure similar values, highlighting the need for validation in cases where discrepancies arise, such as cancer stage values not matching across different databases for the same patient.\n\n2. **Calculated Data Values**: It addresses the validation of calculated data values, such as BMI, by comparing them against common knowledge to identify implausible values that suggest inaccuracies in the underlying data.\n\n3. **Temporal Plausibility**: The framework includes metrics for verifying the temporal plausibility of medical records, ensuring that dates and sequences of events, like admission and discharge dates or vaccine administration sequences, conform to expected properties.\n\n4. **External Comparators**: Utilization of external comparators, such as insurance data or EMA recommendations, plays a crucial role in validating and verifying the accuracy and plausibility of data. This includes ensuring that outpatient procedure lengths of stay and immunization sequences align with external standards.\n\n5. 
**Data Value Density**: It also considers the validation of data value density against a time-oriented denominator, using external knowledge to expect certain patterns, such as spikes in immunization counts outside of flu season.\n\nEntities involved in the framework include:\n- Dependent databases\n- Calculated data values (e.g., BMI)\n- Temporal properties of medical records\n- External comparators (insurance data, EMA recommendations)\n- Measures of data value density\n\nOverall, the framework provides a comprehensive approach to assessing data quality in EU medicines regulation by focusing on validation, verification, temporal plausibility, calculated values, and external comparisons.", "excerpt_keywords": "Data quality, EU medicines regulation, completeness, coverage, extensiveness"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\nin case a gold standard is not available, a metric can be \"verified\" based on a comparison with a similar metric from another source. this could be equivalent values for identical measurements from two independent databases. this is not necessarily a \"gold standard\" but may be the best available option. alternative methods might be explored when no \"gold standard\" is available (see 8.6 for more details).\n\n# 6.2. extensiveness\n\nextensiveness is defined as the dimension capturing the amount of data available.\n\nthe extensiveness dimension answers the question, \"how much data do we have\"? when considering the \"fit for purpose\" definition of quality, extensiveness covers how sufficient the data are.\n\n# 6.2.1. sub-dimensions of extensiveness\n\nwhen considering the amount of information available, one can think of expressing this as a percentage relative to the total amount of information that could be available. 
the distinction between completeness and coverage stems from the definition of the scope of totally available information.\n\n|sub-dimension|definition|\n|---|---|\n|completeness|measures the amount of information available with respect to the total information that could be available given the capture process and data format. data unavailable in the dataset (either due to systematic reasons such as information available in the data source but not included in the data model, or specific entries that are unavailable for a given field) are called \"missing\". for example, the percentage of non-missing values for a required field (e.g., sex) in a dataset would be a measure for completeness.|\n|coverage|measures the amount of information available with respect to what exists in the real world, whether it is inside the capture process and data format or not. coverage may not be easily measured as the total information may not be definable or accessible. an example of coverage is the percentage of a given population (e.g., a country or a specific demographic) available in a dataset. when considering coverage in its relation to evidence generation methods, it is also referred to as observability [18].|\n\n# 6.2.1.1. other dq concepts related to extensiveness\n\ntwo concepts that are often associated with extensiveness are representativeness and missingness. while these concepts describe to a certain extent how much data is available, they are more importantly used to characterise how well the data reflect reality. representativeness is defined as the data having the same characteristics as the whole it is meant to represent (e.g., whether a set of individuals present in a dataset is representative of a population under study). missingness is meant as the characterisation of the impact of incomplete data with respect to the coverage of a dataset.\n\nextensiveness combines two typical dimensions found in dqfs: completeness and coverage. 
they are here combined as they both relate to the amount of data available.\n\nthere is a fundamental distinction between missing data that are known to exist (e.g., the date of birth of a patient) and missing data whose existence is unknown (e.g., the presence of co-pathologies). quite often in a data model the definition of a variable as \"required\" implies that the relative values are known to exist, and therefore when such data is missing it is a \"missing known\". in general, in the absence of explicit negation, it may not be possible to distinguish \"missing knowns\" from \"missing unknowns\". when some data point is expected to be captured, the inability to distinguish \"missing knowns\" from \"missing unknowns\" is an issue of reliability (one is unable to assess if data corresponds to the reality it is meant to represent).", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "c211e7cf-ab76-46f0-83d1-903c0cba2c53": {"__data__": {"id_": "c211e7cf-ab76-46f0-83d1-903c0cba2c53", "embedding": null, "metadata": {"page_label": "20", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Comprehensive Evaluation of Dataset Quality: Assessing Completeness, Coverage, and Coherence\"", "questions_this_excerpt_can_answer": "1. How does the document define the process of assessing the completeness of data within the context of EU medicines regulation, and what specific metrics are recommended for evaluating the presence of data in a dataset relative to a predefined data model?\n\n2. What examples does the document provide to illustrate the application of metrics for assessing the extensiveness of data, specifically in terms of completeness and coverage, within the pharmaceutical or healthcare sector?\n\n3. 
In the context of EU medicines regulation, how is coherence (or consistency) of a dataset defined, and what are the key considerations for ensuring that different parts of a dataset are consistent in their representation and meaning, especially when analyzing the dataset as a whole?", "prev_section_summary": "This section from the document titled \"Comprehensive Evaluation of Data Quality: Analyzing Completeness, Coverage, and Core Concepts in Data Quality Frameworks\" focuses on the aspects of data quality within the framework of EU medicines regulation, specifically addressing the concepts of completeness, coverage, and extensiveness in the context of data quality frameworks (DQFs). It outlines methods for verifying data quality metrics in the absence of a gold standard, by comparing metrics with similar ones from other sources. The document defines extensiveness as the dimension that captures the amount of data available, emphasizing its role in determining whether the data are sufficient for their intended purpose.\n\nKey topics include:\n- **Verification of Metrics**: The document discusses alternative methods for verifying metrics when a gold standard is not available, suggesting the use of equivalent values from independent databases as a viable option.\n- **Extensiveness**: Defined as the amount of data available, this dimension is crucial for assessing the sufficiency of data for specific purposes.\n- **Completeness vs. Coverage**: The document differentiates between completeness (the amount of information available relative to what could be available given the capture process and data format) and coverage (the amount of information available relative to what exists in the real world). 
Examples are provided to illustrate these concepts, such as the percentage of non-missing values for a required field indicating completeness, and the percentage of a given population in a dataset indicating coverage.\n- **Sub-dimensions of Extensiveness**: It introduces representativeness (how much the data reflect the reality they are meant to represent) and missingness (the impact of incomplete data on the coverage of a dataset) as related concepts.\n- **Missing Data**: The document also touches on the distinction between missing data known to exist and missing data whose existence is unknown, highlighting the implications for data reliability.\n\nEntities mentioned include:\n- **European Medicines Agency (EMA)**: The document provides the official address and contact information for the EMA, indicating its relevance to the EU medicines regulation context.\n\nOverall, the section emphasizes the importance of understanding and measuring the completeness, coverage, and extensiveness of data to ensure high-quality data frameworks in the realm of EU medicines regulation.", "excerpt_keywords": "data quality framework, EU medicines regulation, completeness, coverage, coherence"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\n# considerations for extensiveness\n\nthe extensiveness of the information collected depends on the specification of the data collection process. however, when combining different datasets for secondary use, there is no guarantee about the completeness of the overall dataset. on an intrinsic level, one can resort to metrics to assess the level of completeness of data. metrics that assess how much data are present in a dataset in respect to what could be present in a given data model are fairly simple to compute. 
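The simple schema-level completeness metric and the population-level coverage metric described above can be sketched in plain Python; the field name ("sex"), the sample records, and the target-population ids are illustrative assumptions only, not part of the framework:

```python
def completeness(records, field):
    """Fraction of records with a non-missing value for `field`
    (completeness with respect to the data model / schema)."""
    present = sum(1 for r in records if r.get(field) not in (None, ""))
    return present / len(records)

def coverage(dataset_ids, population_ids):
    """Fraction of the real-world target population present in the dataset
    (coverage with respect to what exists outside the capture process)."""
    population = set(population_ids)
    return len(set(dataset_ids) & population) / len(population)

# hypothetical sample data
records = [
    {"id": 1, "sex": "f"},
    {"id": 2, "sex": None},   # missing required value
    {"id": 3, "sex": "m"},
    {"id": 4, "sex": "u"},
]
print(completeness(records, "sex"))          # -> 0.75 (3 of 4 values present)
print(coverage([1, 2, 3, 4], range(1, 11)))  # -> 0.4 (4 of 10 target individuals)
```

Note that completeness needs only the dataset and its schema, while coverage needs an external list of the target population, which is exactly why it is the harder metric to verify.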
metrics that assess how complete the data are with respect to the population they intend to measure are more complex and may involve the engagement of gold standards. completeness with respect to a schema is easily definable, while coverage depends on some assumptions that can be defined only with respect to a question. thresholds used as acceptance criteria can also be defined with respect to a question (e.g., 80% complete).\n\n# examples of metrics for extensiveness\n\n|sub-dimension|metric group|abstract rule|reference|example|\n|---|---|---|---|---|\n|completeness|missing values|missing values with respect to a local schema - over time|verification|breed or sex of the animal should not be null.|\n| | |missing values with respect to a local schema - single time|verification|the encounter id variable has missing values.|\n| |estimated missing values|missing values with respect to common expectations|verification|sudden drop of diagnosis codes due to a defective feed from a claim clearing house vendor.|\n| | |relative assessment of missing values with respect to a trusted source of knowledge|validation|the current encounter id variable is missing twice as many values as the institutionally validated database. a drop in icd-9cm codes upon implementation of icd-10-cm.|\n|coverage| |coverage of a population|verification|the percentage of a target population present in a database.|\n\n# coherence\n\ncoherence (also referred to as consistency) is defined as the dimension that expresses how different parts of an overall dataset are consistent in their representation and meaning.\n\nthe coherence dimension answers the questions: is the dataset analysable as a \"whole\" or are additional steps needed, such as linkage of multiple datasets? is the format of values (e.g., dates) the same across the dataset? 
is the precision of values the same (e.g., age always approximated to\n\nconsistency and coherence can be considered largely synonymous, with the caveat that detection of inconsistencies is often a way to measure the reliability of data.\n\ndata quality framework for eu medicines regulation ema/326985/2023 page 20/42", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "080c3d72-2eac-4732-bbfa-bb174a853d20": {"__data__": {"id_": "080c3d72-2eac-4732-bbfa-bb174a853d20", "embedding": null, "metadata": {"page_label": "21", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Assessing the Multifaceted Aspects of Coherence in Data Quality within European Union Medicines Regulation\"", "questions_this_excerpt_can_answer": "1. How does the document define and differentiate between the sub-dimensions of coherence within the context of data quality for EU medicines regulation, specifically regarding format coherence, structural or relational coherence, and semantic coherence?\n\n2. What role does uniqueness play in the coherence dimension of data quality according to the EU medicines regulation framework, and how does it interact with the concept of redundancy in improving other dimensions such as reliability?\n\n3. 
How does the document address the improvement of data coherence in the context of EU medicines regulation, particularly in terms of data standardization processing steps and the challenges of achieving coherence for data aggregated and repurposed for secondary usage?", "prev_section_summary": "The section from the document titled \"Comprehensive Evaluation of Dataset Quality: Assessing Completeness, Coverage, and Coherence\" focuses on the evaluation of dataset quality within the context of EU medicines regulation, emphasizing the importance of assessing the extensiveness of data. It outlines the process and metrics for evaluating data completeness, coverage, and coherence, which are crucial for ensuring the reliability and usability of datasets in the pharmaceutical or healthcare sector.\n\nKey topics include:\n\n1. **Considerations for Extensiveness**: This part discusses the dependency of data extensiveness on the data collection process specification and the challenges in ensuring dataset completeness when combining datasets for secondary use. It introduces metrics for assessing data completeness both in relation to a predefined data model and the population intended to be measured.\n\n2. **Metrics for Assessing Extensiveness**: The document provides examples of metrics for evaluating data extensiveness, particularly focusing on completeness and coverage. It categorizes metrics into sub-dimensions such as missing values and estimated missing values, providing examples and verification/validation methods for each. Coverage metrics assess the representation of a target population within a database.\n\n3. **Coherence**: Defined as the consistency of different parts of a dataset in terms of representation and meaning, coherence is essential for analyzing a dataset as a whole. 
The section highlights the importance of format and precision consistency across the dataset and suggests that coherence and consistency are synonymous to some extent, with inconsistency detection serving as a method for assessing data reliability.\n\nEntities mentioned include:\n- **Data Quality Framework for EU Medicines Regulation**: The broader context in which these considerations and metrics are applied.\n- **Metrics and Examples**: Specific metrics (e.g., missing values, estimated missing values, coverage of a population) and examples (e.g., breed or sex of an animal should not be null, sudden drop of diagnosis codes) illustrate the application of these metrics in real-world scenarios.\n- **EMA/326985/2023**: The reference number associated with the data quality framework document.\n\nOverall, the section emphasizes the critical role of assessing completeness, coverage, and coherence in evaluating dataset quality, providing a structured approach and examples to guide the assessment process within the EU medicines regulation context.", "excerpt_keywords": "data quality, EU medicines regulation, coherence, data standardization, semantic coherence"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\nyears)? are references to entities consistent so that information about the same entity is properly \"linked\" across parts of the dataset?\n\nwhen considering the \"fit for purpose\" definition of quality, coherence relates to the analysability of data.\n\n# 6.3.1. 
sub-dimensions of coherence\n\ncoherence is a complex and nuanced dimension that includes the following sub-dimensions:\n\n|format coherence:|whether data are expressed in the same way throughout a dataset (e.g., a dataset mixing dates represented as dd-mm-yyyy and mm-dd-yyyy will not be suitable for an integrated analysis).|\n|---|---|\n|structural or relational coherence:|whether the same entities are identified in the same way throughout a dataset. a sub-aspect of structural coherence is that references are resolved to the correct entities (e.g., a sample annotation table will refer to the correct value in a result table).|\n|semantic coherence:|whether the same values mean the same thing throughout a dataset. for instance, whether \"anuria\" means a condition of total cessation of urine production or the measurement of the amount of urine, or whether the same notion of a measure is intended to have the same precision throughout a dataset.|\n|uniqueness:|uniqueness is the property that the same information is not duplicated but appears in the dataset once. this problem is typical for data aggregated from different sources. note that data with some redundancy will score lower in the uniqueness dimension, but those extra records could help improve other dimensions, such as reliability.|\n|other dq concepts related to coherence:|conformance, when this is defined with respect to a specific reference or data model. conformance may practically be the best way to assess coherence, and it is also specialised as format, structural and semantic conformance. as an example, conformance would assess if the representation of data is coherent by assessing if it is the same as an overall target standard (e.g.: dd-mm-yyyy). validity is a narrower case of conformance that is defined when the reference model is specific to the dataset being assessed. as an example, if a file is associated with a schema specifying that all dates in the d.o.b. 
field should be in dd-mm-yyyy format, the file could be directly assessed as valid or not.|\n\n# 6.3.2. considerations for coherence\n\ncoherence of data at source largely depends on foundational determinants such as the synchronisation of processes and systems across an organisation generating data, or when multiple data are aggregated on the commitment of such organisation(s) to the use of internal or external data standards. by extension, coherence for data aggregated and repurposed for secondary usage depends on the availability of shared standards and reference data. the intrinsic aspects of coherence of a dataset can be improved, largely within a data standardisation processing step. however, improving coherence involves approximating or clarifying the meaning of data. access to the source system and\n\nstructural and relational coherence are treated as synonyms here. it may be the case that these two concepts are distinct for non-tabular data. this distinction will be addressed if the need arises, in extensions of this framework to specific data types.\n\nit is worth noting that \"information\" is distinct from data. two distinct measurements resulting in the same data would not constitute duplicate information (and such measurements would most likely differ in value, when metadata is included). 
whereas the same measurement reported two times would amount to a duplication.\n\na file could be coherent, but not conformant, if all values are coherent (e.g.: mm-dd-yyyy) while an overall target standard proposed to assess conformance requires dd-mm-yyyy.\n\nas noted in 5.4, this is a different meaning (in common use) than what is defined for \"validation\".\n\ndata quality framework for eu medicines regulation ema/326985/2023 page 21/42", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "4b4b0343-0672-4905-bc5c-aa64b80b2052": {"__data__": {"id_": "4b4b0343-0672-4905-bc5c-aa64b80b2052", "embedding": null, "metadata": {"page_label": "22", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Evaluating Semantic Coherence and Data Quality within the European Union's Medicinal Regulatory Framework\"", "questions_this_excerpt_can_answer": "Based on the provided excerpt and contextual information from the document titled \"Evaluating Semantic Coherence and Data Quality within the European Union's Medicinal Regulatory Framework,\" here are three specific questions that the context can provide specific answers to, which are unlikely to be found elsewhere:\n\n1. 
**What are the challenges in assessing semantic coherence within the EU's medicinal regulatory framework?**\n - The excerpt mentions that some aspects of semantic coherence may be difficult to assess with a metric, indicating that the document likely discusses specific challenges or limitations in evaluating semantic coherence in the context of EU medicines regulation. This question targets the document's insights into the complexities of ensuring semantic coherence, which is crucial for regulatory compliance and effective communication within the pharmaceutical sector.\n\n2. **How does the European Medicines Agency (EMA) propose to assess data quality in the context of EU medicines regulation, as per the document dated 2023?**\n - Given the reference to a \"data quality framework for EU medicines regulation\" and the specific EMA document number (ema/326985/2023), this question seeks detailed information on the methodologies, criteria, or guidelines proposed or established by the EMA for assessing data quality. This is particularly relevant for stakeholders in the pharmaceutical industry looking to align with regulatory expectations.\n\n3. **In what ways does the document suggest handling the assessment of data quality and semantic coherence when metrics are not applicable?**\n - The excerpt hints at alternative approaches to assess semantic coherence and data quality beyond conventional metrics, especially in cases where metrics fall short. 
This question aims to uncover specific strategies, examples, or analytical methods recommended by the document for such scenarios, which could be invaluable for regulatory affairs professionals and data managers in the pharmaceutical industry seeking to ensure compliance and enhance the quality of regulatory submissions.\n\nThese questions are designed to extract valuable insights from the document that are directly relevant to regulatory compliance, data management, and the overall quality of medicinal product information within the EU's regulatory framework.", "prev_section_summary": "This section from the document titled \"Assessing the Multifaceted Aspects of Coherence in Data Quality within European Union Medicines Regulation\" delves into the concept of coherence as a critical dimension of data quality, particularly in the context of EU medicines regulation. It outlines the sub-dimensions of coherence, which include format coherence, structural or relational coherence, semantic coherence, and uniqueness. 
Each sub-dimension addresses different aspects of how data should be consistently represented, identified, and understood across a dataset to ensure its quality and reliability for analysis and decision-making.\n\n- **Format Coherence** is about ensuring data are expressed uniformly across a dataset, such as using consistent date formats.\n- **Structural or Relational Coherence** focuses on the consistent identification of entities throughout a dataset and the correct resolution of references.\n- **Semantic Coherence** concerns the consistent meaning of values across a dataset, ensuring that the same terms or measures carry the same meaning everywhere they appear.\n- **Uniqueness** emphasizes the importance of having each piece of information appear only once in a dataset to avoid duplication, noting that some redundancy might be beneficial for enhancing other dimensions like reliability.\n\nThe document also discusses related data quality (DQ) concepts such as conformance and validity, which are linked to coherence. Conformance is about aligning with specific references or data models, while validity is a more narrow case of conformance specific to the dataset being assessed.\n\nFor improving data coherence, the document highlights the importance of foundational determinants like the synchronization of processes and systems, the commitment to using internal or external data standards, and the challenges in achieving coherence for data aggregated and repurposed for secondary usage. It suggests that data standardization processing steps can largely improve coherence by clarifying the meaning of data.\n\nFurthermore, the section makes a distinction between \"information\" and \"data,\" noting that duplicate measurements do not constitute duplicate information if they result in different data values. 
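The distinction drawn here between format coherence (one consistent format throughout) and conformance (matching an external target standard) can be sketched as follows; the regex patterns and the dd-mm-yyyy target are illustrative assumptions, and yyyy-mm-dd is used as the second format so that detection stays unambiguous:

```python
import re

# hypothetical date-format patterns (illustrative only)
DATE_PATTERNS = {
    "dd-mm-yyyy": re.compile(r"^(0[1-9]|[12]\d|3[01])-(0[1-9]|1[0-2])-\d{4}$"),
    "yyyy-mm-dd": re.compile(r"^\d{4}-(0[1-9]|1[0-2])-(0[1-9]|[12]\d|3[01])$"),
}

def formats_used(values):
    """Return the set of known date formats that appear in the values."""
    found = set()
    for v in values:
        for name, pattern in DATE_PATTERNS.items():
            if pattern.match(v):
                found.add(name)
    return found

def is_coherent(values):
    """Coherent: every value uses one and the same format."""
    return len(formats_used(values)) == 1

def is_conformant(values, target="dd-mm-yyyy"):
    """Conformant: every value matches the external target standard."""
    return all(DATE_PATTERNS[target].match(v) for v in values)

dates = ["2023-01-31", "2023-02-15"]
print(is_coherent(dates))    # True: one format used throughout
print(is_conformant(dates))  # False: coherent but not conformant to dd-mm-yyyy
```

This reproduces the framework's point that a dataset can be coherent without being conformant: the check for coherence needs only the dataset itself, while conformance needs an externally agreed target standard.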
It also clarifies that a dataset could be coherent without being conformant to an overall target standard, illustrating this with an example related to date formats.\n\nThis detailed exploration of coherence within the EU medicines regulation data quality framework provides a comprehensive understanding of how coherence impacts the analysability and reliability of data, and the measures necessary to enhance coherence in datasets used within this regulatory context.", "excerpt_keywords": "semantic coherence, data quality framework, European Medicines Agency, EU medicines regulation, data standardization"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\nprocesses is often required for clarifications as an example. some aspects of semantic coherence may be difficult to assess with a metric and hence can only be assessed with respect to a specific question and analysis strategy.\n\ndata quality framework for eu medicines regulation\n\nema/326985/2023 page 22/42", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "a7b8fdb5-4556-4ef4-879f-cb1b5c6ef740": {"__data__": {"id_": "a7b8fdb5-4556-4ef4-879f-cb1b5c6ef740", "embedding": null, "metadata": {"page_label": "23", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Guidelines for Ensuring Data Integrity and Conformance in Submissions to the European Medicines Agency: Contact Information and Compliance Requirements\"", 
"questions_this_excerpt_can_answer": "1. What specific examples of data quality metrics are outlined in the \"Guidelines for Ensuring Data Integrity and Conformance in Submissions to the European Medicines Agency\" for ensuring the integrity of data submitted to the European Medicines Agency (EMA)?\n\n2. How does the document \"Guidelines for Ensuring Data Integrity and Conformance in Submissions to the European Medicines Agency\" address the issue of ensuring that data values conform to allowable values or ranges, particularly in the context of animal sex identification in pharmaceutical data submissions?\n\n3. What are the specific guidelines provided by the European Medicines Agency in the document for maintaining relational coherence, particularly regarding the uniqueness of medical record numbers and their linkage across different tables in data submissions?", "prev_section_summary": "The section from the document titled \"Evaluating Semantic Coherence and Data Quality within the European Union's Medicinal Regulatory Framework\" touches upon two main topics: the challenges of assessing semantic coherence and the framework for assessing data quality in the context of EU medicines regulation. \n\n1. **Semantic Coherence**: It highlights the difficulty in evaluating semantic coherence through quantitative metrics alone, suggesting that such assessments often require a more nuanced approach tailored to specific questions and analysis strategies. This implies a need for qualitative assessments or alternative methods when conventional metrics are not applicable, emphasizing the complexity of ensuring semantic coherence in regulatory contexts.\n\n2. **Data Quality Framework**: The document references a specific EMA document (ema/326985/2023) that presumably outlines a framework for assessing data quality within EU medicines regulation. 
This suggests that the European Medicines Agency has developed guidelines or criteria for evaluating the quality of data in regulatory submissions, which is crucial for maintaining the integrity and reliability of medicinal product information.\n\nThe key entities mentioned include:\n- The European Medicines Agency (EMA), which is implied to play a significant role in establishing the criteria or methodologies for data quality assessment.\n- The document itself, which serves as a resource for understanding the challenges and proposed solutions for assessing semantic coherence and data quality within the EU's medicinal regulatory framework.\n\nThis section is particularly relevant for professionals in regulatory affairs, data management, and anyone involved in the pharmaceutical industry within the EU, providing insights into the complexities of regulatory compliance and the importance of high-quality data submissions.", "excerpt_keywords": "data integrity, European Medicines Agency, data quality framework, pharmaceutical data submissions, relational coherence"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\n|sub-dimension|metric group|abstract rule|reference|example|\n|---|---|---|---|---|\n|format coherence (conformance)|syntactic constraints|data values conform to internal formatting constraints|verification|sex is only one ascii character.|\n| |allowed values|data values conform to allowable values or ranges|verification|sex for the animal only has values \"m\", \"f\", or \"u\".|\n| | |data values conform to the representational constraints based on external standards|validation|values for primary language conform to iso standards.|\n|relational coherence (conformance)|reference coherence|data values conform to relational constraints|verification|patient medical record number links to other tables as expected.|
| | |unique (key) data values are not duplicated|verification|a medical record number is assigned to a single patient.|\n| |schema coherence|changes to the data model or data model versioning|verification|version 1 data does not include medical discharge hour.|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "64488cf0-301f-4740-b07b-bc2fb6a0c61d": {"__data__": {"id_": "64488cf0-301f-4740-b07b-bc2fb6a0c61d", "embedding": null, "metadata": {"page_label": "24", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Ensuring Coherence, Validation, and Uniqueness in EU Medicines Regulation: A Data Quality Framework for Computed and Coded Data\"", "questions_this_excerpt_can_answer": "1. How does the EU medicines regulation data quality framework ensure computational coherence in the context of BMI calculations?\n \n2. What specific examples does the EU medicines regulation data quality framework provide to illustrate the principle of semantic coherence, particularly in the use of code lists and precision of values?\n\n3. 
In the context of ensuring data uniqueness within the EU medicines regulation data quality framework, how is the issue of a subject being represented with multiple identities addressed, and what example is provided to illustrate this?", "prev_section_summary": "The section provides detailed guidelines from the document titled \"Guidelines for Ensuring Data Integrity and Conformance in Submissions to the European Medicines Agency,\" focusing on ensuring the integrity and conformance of data submitted to the European Medicines Agency (EMA). It outlines specific data quality metrics within the framework of a data quality framework for EU medicines regulation. The key topics covered include:\n\n1. **Format Coherence (Conformance)**: This involves syntactic constraints ensuring that data values adhere to internal formatting constraints, with an example being the requirement for the sex of an animal to be represented as a single ASCII character.\n\n2. **Allowed Values**: This metric ensures that data values conform to allowable values or ranges, such as specifying that the sex for an animal can only have the values \"m\" (male), \"f\" (female), or \"u\" (unknown). It also covers representational constraints based on external standards, like primary language values conforming to ISO standards.\n\n3. 
**Relational Coherence (Conformance)**: This includes reference coherence, ensuring data values conform to relational constraints (e.g., a patient's medical record number correctly linking to other tables), the uniqueness of key data values to prevent duplication (e.g., a medical record number is unique to a single patient), and schema coherence, which involves adherence to data model or versioning changes (e.g., version 1 data not including medical discharge hour).\n\nThe section also provides contact information for the European Medicines Agency, located at Domenico Scarlattilaan 6, 1083 HS Amsterdam, The Netherlands, and guides on how to send questions or visit the agency.\n\nEntities mentioned include:\n- European Medicines Agency (EMA)\n- ASCII character format for data representation\n- ISO standards for primary language values\n- Data quality metrics such as format coherence, allowed values, and relational coherence\n\nThis summary encapsulates the guidelines for data integrity and conformance as outlined in the document, emphasizing the importance of adhering to specific data quality metrics and standards for submissions to the EMA.", "excerpt_keywords": "BMI calculations, semantic coherence, code lists, data uniqueness, computed values"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\n|sub-dimension|metric group|abstract rule|reference|example|\n|---|---|---|---|---|\n|computation al coherence|computation|computed values conform to programming specifications|verification|database calculated and hand calculated bmi (body mass index) values are identical.|\n|validation|computed results based on published algorithms yield values that match validation values provided by external sources| | | |\n|validation|computed bmi percentiles yield identical values compared to test results and values provided by ema.| | | |\n|semantic coherence 
(conformance)|precision|the precision of values is fitting a target standard|verification|e.g., two decimal digits are used and generally not zero.|\n|semantic coherence|use of code lists is consistent across data|verification|e.g., the level of a meddra coding for an indication doesn't vary across the dataset.| |\n|uniqueness|same subject is represented with the same identity|verification|william smith is also represented as bill smith with the same dob.| |\n|uniqueness|same subject is represented with multiple identities|verification|william smith and william smith appear as separate individuals instead of the same individual.| |\n|validation|william smith's dob and id match with bill smith's dob and id.| | | |\n\ndata quality framework for eu medicines regulation\n\nema/326985/2023 page 24/42", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "4c41cb0b-84ea-4c50-9eb8-a25df47aedc2": {"__data__": {"id_": "4c41cb0b-84ea-4c50-9eb8-a25df47aedc2", "embedding": null, "metadata": {"page_label": "25", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Enhancing Regulatory Decision Making: A Comprehensive Approach to Assessing Data Quality through Timeliness, Currency, and Relevance\"", "questions_this_excerpt_can_answer": "1. How does the document define the concept of \"timeliness\" in the context of data quality for regulatory decision-making within EU medicines regulation, and what specific aspects does it consider under its sub-dimensions?\n\n2. 
What are the specific metrics proposed for assessing the \"currency\" aspect of timeliness in a database, according to the document's framework for enhancing regulatory decision-making through data quality?\n\n3. How does the document differentiate between the concepts of \"timeliness\" and \"currency\" in the context of data quality, and what implications does this distinction have for the evaluation of data's relevance to regulatory decision-making in the pharmaceutical sector?", "prev_section_summary": "The section from the document titled \"Ensuring Coherence, Validation, and Uniqueness in EU Medicines Regulation: A Data Quality Framework for Computed and Coded Data\" focuses on the principles and practices for maintaining high data quality within the context of EU medicines regulation. It outlines specific metrics and examples across three main sub-dimensions: computational coherence, semantic coherence (conformance), and uniqueness.\n\n1. **Computational Coherence**: This sub-dimension emphasizes the importance of ensuring that computed values, such as BMI (Body Mass Index), conform to programming specifications and match validation values provided by external sources, including those from the European Medicines Agency (EMA). It highlights the necessity for computed results, like BMI percentiles, to be consistent and verifiable against test results and external validation values.\n\n2. **Semantic Coherence (Conformance)**: This aspect focuses on the precision of values and the consistent use of code lists across data sets. It stresses the importance of maintaining a standard level of precision, exemplified by using two decimal digits for values, and ensuring that the use of code lists, such as MedDRA coding for indications, remains consistent throughout the dataset to avoid semantic discrepancies.\n\n3. **Uniqueness**: The framework addresses the challenge of representing subjects with unique identities to prevent duplication or misrepresentation. 
It provides examples where a subject known as \"William Smith\" is also referred to as \"Bill Smith,\" highlighting the need for such representations to be linked to the same date of birth (DOB) and identity to maintain data integrity. It also points out the issue of representing the same subject as multiple individuals due to inconsistencies in naming or identification, underscoring the importance of validation processes to match DOBs and IDs for ensuring uniqueness.\n\nThe document, with a reference number EMA/326985/2023, is found on page 24 of 42 and delves into the specifics of ensuring data quality through computational validation, semantic precision, and the uniqueness of data representation in the realm of EU medicines regulation.", "excerpt_keywords": "data quality, timeliness, currency, regulatory decision making, EU medicines regulation"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\n# 6.4. timeliness\n\ntimeliness is defined as the availability of data at the right time for regulatory decision making, which in turn entails that data are collected and made available within an acceptable time.\n\nthe timeliness dimension answers the question: are the data reflecting the intended reality at the point of time of its use?\n\nwhen considering the \"fit for purpose\" definition of quality, timeliness covers how closely the data reflect the intended reality, at the time in which it is used.\n\n# 6.4.1. sub-dimensions of timeliness\n\n- currency is a specific aspect of timeliness that considers how fresh the data are (e.g., current, and immediately useful).\n\n# 6.4.1.1. other dq concepts related to timeliness\n\nin the context of this framework, lateness, intended as the aspect of data being captured later than asserted, falls in the dimension of reliability (does the data correspond to reality, at the time it is intended to measure?).\n\n# 6.4.2. 
considerations for timeliness\n\ntimeliness is determined by the systems and processes used to collect and make data available.\n\n# 6.4.3. examples of metrics for timeliness\n\n|sub-dimension|metric group|abstract rule|reference|\n|---|---|---|---|\n|currency|n/a|the average time of updates in a database (or timestamp)|verification|\n| | |the last update of a database (or timestamp)|verification|\n\n# 6.5. relevance\n\nfor the purpose of data quality assessment, relevance is defined as the extent to which a dataset presents the data elements useful to answer a given research question. this definition is narrower and more data-focused than the more commonly understood meaning of \"relevance\" (i.e.: relevance of a\n\nwhile timeliness is not further distinguished in this version of this framework, the definition highlights two different aspects of timeliness: respect to the time data is measured (e.g.: delay between measurements of body temperature respect to the onset of fever), and respect to the time data is collected (e.g.: made available in a database).\n\nnote that the lack of currency doesn't imply a lack of timeliness: historic data may lack currency, but still be timely for retrospective studies.\n\nmeasures of currency focus on a narrower aspect of timeliness and are generally based on the time data are actually recorded in a database (rather than the time of data collection).\n\nrelevance as a common term is defined as the degree to which something is related or useful to what is happening or being discussed, or to a given objective.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "025167d9-0702-45ef-b606-20ec0eb0a2cd": {"__data__": {"id_": "025167d9-0702-45ef-b606-20ec0eb0a2cd", "embedding": null, "metadata": {"page_label": "26", "file_name": "[37] Data quality framework EU medicines regulation.pdf", 
"file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Evaluating Relevance and Extensiveness as Key Dimensions of Data Quality for Regulatory Decision Making in EU Medicines Regulation\"", "questions_this_excerpt_can_answer": "Based on the detailed context provided from the document excerpt on the data quality framework for EU medicines regulation, here are three specific questions that this context can provide specific answers to, which are unlikely to be found elsewhere:\n\n1. **How does the relevance dimension of data quality specifically apply to the evaluation of datasets for regulatory decision-making in EU medicines regulation?**\n - This question targets the unique application of the relevance dimension within the specific regulatory context of EU medicines regulation, focusing on how datasets are evaluated for their ability to address specific research questions using particular methods, which is a nuanced aspect detailed in the provided context.\n\n2. **What are the specific criteria and metrics used to assess the relevance of a dataset in the context of EU medicines regulation, and how do these criteria support regulatory decision-making?**\n - This question seeks detailed information on the operationalization of the relevance dimension in regulatory contexts, including the use of metrics such as the number of variables available versus required in a dataset. The context provided outlines a framework for evaluating data quality that includes relevance among other dimensions, making it a rich source for understanding how these metrics are applied and their importance in regulatory processes.\n\n3. 
**How does the data quality framework distinguish between the relevance and extensiveness dimensions of data quality, and why is this distinction important for regulatory decision-making in the context of EU medicines regulation?**\n - The distinction between relevance and extensiveness is crucial in regulatory contexts, as it separates the quantity of data from the suitability of data for specific regulatory questions. This question digs into the rationale behind this distinction and its implications for the selection and use of data in making regulatory decisions, a topic directly addressed in the provided context.\n\nThese questions leverage the specific insights and details provided in the excerpt, focusing on the application of data quality dimensions in the regulatory decision-making process for EU medicines regulation. The context offers a unique perspective on how data quality is assessed and utilized in this specific domain, making it a valuable source for answers to these nuanced questions.", "prev_section_summary": "The document section delves into the concept of \"timeliness\" within the context of data quality for regulatory decision-making in EU medicines regulation, emphasizing its importance in ensuring data are available at the right time. Timeliness is defined as the reflection of data's ability to represent the intended reality at the point of its use, aligning with the \"fit for purpose\" quality definition. 
It encompasses the sub-dimension of \"currency,\" which focuses on the freshness of data, indicating its current and immediate usefulness.\n\nThe document also touches upon other data quality (DQ) concepts related to timeliness, such as \"lateness,\" which is categorized under the dimension of reliability, questioning whether data correspond to reality at the intended time of measurement.\n\nFor assessing timeliness, the document proposes specific metrics, particularly for the currency aspect, including the average time of updates and the timestamp of the last update in a database, both subject to verification.\n\nFurthermore, the document outlines the concept of \"relevance\" in data quality assessment, defining it as the extent to which a dataset contains data elements useful for answering a specific research question. This definition is presented as more focused and data-centric compared to the broader understanding of relevance.\n\nThe distinction between timeliness and currency is highlighted, noting that while currency pertains to the freshness of data in a database, timeliness has a broader application, including the consideration of data's measurement and collection times. 
The document clarifies that historic data may lack currency but can still be timely for retrospective studies, indicating that measures of currency are a narrower aspect of timeliness based on the actual recording time in a database.", "excerpt_keywords": "data quality, relevance, EU medicines regulation, regulatory decision-making, dataset evaluation"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\n# data source to generate valid evidence informing a specific research question based on the study design\n\nto distinguish these two meanings, this text explicitly makes use of the terms \"relevance dq dimension\" (relevance as here defined), and \"relevance to a question\" (the more generic meaning).\n\nthe relevance dq dimension answers the question: does the dataset present the values (or data elements) that are needed to address a specific question, using a specific method?\n\nwhen considering the \"fit for purpose\" definition of quality, the relevance dq dimension describes how the data cover the aspects of reality that are intended to be measured.\n\nin this framework, relevance to a question is captured by \"question specific determinants\" that apply to all dimensions.\n\nthe dimension previously introduced partition dq aspects on the basis for some driving questions (is data truly representing reality? how much data is there? is data analysable as a whole? is data available at the right time?). a missing question is about what type of data is there, and this is what the \"relevance dq dimension\" is covering.\n\ngiven the context described, relevance can only be characterised in relation to a research question and a data analysis strategy. however, in some cases, it is possible to identify a set of frequently required research questions that can be characterised from the relevance point of view, in the context of a specific type of data source. 
this is referred to as relevance for a domain, where domain is a shorthand for a research questions domain.\n\n# examples of metrics for relevance\n\n|sub-dimension|metric group|abstract rule|reference|example|\n|---|---|---|---|---|\n|n/a|n/a|the number of variables (columns) available in a given dataset vs the number of required variables.|verification|n/a|\n\n# general recommendations and maturity models\n\nselecting datasets to use in regulatory decision making ultimately requires knowledge of the degree to which such data satisfy the reliability, extensiveness, coherence, timeliness and relevance criteria. such quality dimensions build up along an overall life cycle from generation through processing to aggregation and ultimately analysis, and in such process, data originally gathered for other usages can be repurposed when ethical or legal requirements are met.\n\nthe choice of quality measures and checks varies broadly depending on data types and their intended use. however, it is possible to organise such measures and checks following a coherent structure that helps achieve homogeneity and identify gaps.\n\nthe following tables exemplify how determinants of quality (foundational, intrinsic or question-specific) affect the different quality dimensions for both data and metadata. these tables provide a\n\nthe distinction between extensiveness and relevance can be clarified by the two distinct questions that these dimensions are answering: how much data do we have? (extensiveness) vs what data do we have? (relevance).\n\nby data analysis strategy, the definition of assumptions, decisions, and methods to address a specific question is intended.\n\nthis metric is provided as an example to clarify what pertains to the dimension of relevance. 
not all variables are equivalent and actual metrics will need to be specified for specific use cases and/or data types.\n\ndata quality framework for eu medicines regulation ema/326985/2023 page 26/42", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "3425d737-af39-45d0-8f04-2b72dde4a339": {"__data__": {"id_": "3425d737-af39-45d0-8f04-2b72dde4a339", "embedding": null, "metadata": {"page_label": "27", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Establishing Comprehensive Data Quality Maturity Models for Enhanced Regulatory Assessment in European Union Medicines Regulation\"", "questions_this_excerpt_can_answer": "1. What are the three distinct maturity models introduced in the Data Quality Framework for EU Medicines Regulation, and how do they relate to the process of regulatory assessment in the context of European Union medicines regulation?\n\n2. How does the Data Quality Framework (DQF) for EU Medicines Regulation propose to evolve the characterisation of data quality (DQ) across different levels of maturity, specifically in terms of process characterisation, intrinsic metrics, and the definition of target questions?\n\n3. 
Given the abstract nature of the maturity models within the Data Quality Framework for EU Medicines Regulation, what specific guidance does the document offer for implementing these models in practice, especially considering the variability in data types and use cases in the regulatory assessment of medicines?", "prev_section_summary": "This section from the document titled \"Evaluating Relevance and Extensiveness as Key Dimensions of Data Quality for Regulatory Decision Making in EU Medicines Regulation\" focuses on the importance of data quality in the context of EU medicines regulation, specifically addressing the dimensions of relevance and extensiveness. The key topics discussed include:\n\n1. **Relevance Dimension of Data Quality**: It is defined as the suitability of a dataset to address a specific research question using a particular method. This dimension is crucial for ensuring that the data cover the aspects of reality intended to be measured, making it \"fit for purpose.\" The text distinguishes between \"relevance dq dimension\" (specifically defined relevance) and \"relevance to a question\" (a more generic meaning).\n\n2. **Metrics for Assessing Relevance**: An example metric provided is the comparison between the number of variables available in a dataset versus the number of variables required. This metric serves as a concrete measure to assess the relevance of a dataset for a given research question.\n\n3. **General Recommendations and Maturity Models**: The document suggests that selecting datasets for regulatory decision-making involves understanding how well the data meet various quality dimensions, including reliability, extensiveness, coherence, timeliness, and relevance. It emphasizes the importance of organizing quality measures and checks in a coherent structure to achieve homogeneity and identify gaps.\n\n4. 
**Distinction Between Extensiveness and Relevance**: The document clarifies that extensiveness and relevance answer two different questions: \"How much data do we have?\" and \"What data do we have?\" respectively. This distinction is vital for regulatory decision-making, as it separates the quantity of data from its suitability for specific regulatory questions.\n\n5. **Data Analysis Strategy**: It is mentioned that data analysis strategy involves defining assumptions, decisions, and methods to address a specific question, which is integral to the relevance dimension.\n\nEntities mentioned include:\n- **Relevance dq dimension**\n- **Metrics for relevance**\n- **General recommendations and maturity models**\n- **Extensiveness vs. relevance**\n- **Data analysis strategy**\n\nThe section provides a detailed overview of how the relevance and extensiveness dimensions of data quality are conceptualized and operationalized within the framework of EU medicines regulation, highlighting their importance in the regulatory decision-making process.", "excerpt_keywords": "data quality, maturity models, regulatory assessment, European Union medicines regulation, evidence-generation"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\nguidance for what metrics and actions apply at which stage of the data life cycle. for example, the dimension of extensiveness is determined exclusively by foundational determinants at production time. further in the data life cycle, data intrinsic measures can only partially assess the degree of reliability (plausibility metrics).\n\nthese tables also form the basis for the development of maturity models for the characterisation of dq for regulatory purposes. the maturity models provide guidance as to how determinants can be characterised in successive levels of maturity. 
higher maturity levels support the strongest possible evidence in the most efficient way.\n\nthree distinct maturity models are provided, corresponding to the three determinants, to depict how maturity evolves with respect to process characterisation, intrinsic aspects (metrics) and the definition of target questions. these models are meant to apply to the different steps and actions that compose an overall evidence-generation framework.\n\nit should be noted that the maturity models provided are abstract in the sense that they provide the classes or recommendations that need to be complemented with implementation detail for specific data types and use cases.\n\nit takes time to characterise and implement processes to achieve higher maturity levels both for data source holders, but also for regulatory assessors to understand the impact of a higher maturity model. this is also context dependent e.g., disease area, disease frequency, health system etc. the dqf will be updated in the upcoming years with further deep dives in regulatory use cases of particular interest to guide clinical assessment for medicines regulation.\n\ndata quality framework for eu medicines regulation ema/326985/2023 page 27/42", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "64353ddd-3c95-4692-bd21-5bb640eadba2": {"__data__": {"id_": "64353ddd-3c95-4692-bd21-5bb640eadba2", "embedding": null, "metadata": {"page_label": "28", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Guidelines for Assessing Data 
Reliability and Coherence in Regulatory Contexts: A European Medicines Agency Perspective\"", "questions_this_excerpt_can_answer": "1. How does the European Medicines Agency (EMA) define and assess the reliability of data in the context of EU medicines regulation, particularly in terms of primary and secondary data sources?\n\n2. What specific criteria or dimensions does the EMA consider when evaluating the coherence and extensiveness of data within the regulatory framework for medicines in the EU, and how are these criteria applied to both primary and secondary data?\n\n3. In the guidelines provided by the EMA, how are timeliness and relevance of data determined in the regulatory assessment process, and what measures or processes are in place to ensure these aspects are maintained throughout data collection and analysis?", "prev_section_summary": "The section discusses the Data Quality Framework (DQF) for EU Medicines Regulation, focusing on the establishment of maturity models to enhance regulatory assessment processes. It outlines three distinct maturity models that evolve based on process characterisation, intrinsic metrics, and the definition of target questions, aiming to provide a structured approach to assessing data quality (DQ) in the context of European Union medicines regulation. These models are designed to guide the characterization of data quality determinants across different levels of maturity, emphasizing the importance of foundational determinants at the production stage and the limitations of intrinsic measures in assessing reliability later in the data life cycle.\n\nThe document highlights the abstract nature of these maturity models, noting that they offer a framework that requires further specification and implementation detail tailored to specific data types and regulatory use cases. 
It acknowledges the challenges and time required for both data source holders and regulatory assessors to achieve and understand the implications of higher maturity levels, which are influenced by various factors such as disease area and health system characteristics.\n\nFurthermore, the excerpt mentions plans to update the DQF with more detailed guidance on regulatory use cases, aiming to support clinical assessment for medicines regulation more effectively. This indicates an ongoing effort to refine and enhance the framework to better meet the needs of regulatory assessment processes within the EU medicines regulation context.", "excerpt_keywords": "Data Quality Framework, European Medicines Agency, regulatory assessment, data reliability, data coherence"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\n|determinant/dimension|reliability|extensiveness|coherence|timeliness|relevance|\n|---|---|---|---|---|---|\n|foundational|primary| |primary| |primary|\n| |data collected following established protocols can be sufficient to address regulatory questions.| | |normally guaranteed by the design of the data collection process.| |\n| |primary and secondary data reliability (in all its aspects) results from systems and processes in place for data generation or collection.|primary and secondary the data collection protocol determines what data are collected.|primary and secondary dependent on the orchestration of processes originating data and on the commitment to internal or external data standards.|primary and secondary solely determined by systems and processes.| |\n| |secondary precision may decrease during data transformation and harmonisation processes.|secondary there is no guarantee on the completeness of an integrated dataset or its coverage for a different use case, and this can only be assessed or controlled.|secondary relies on shared standards and 
reference data. documentation on data generation processes may be needed to enhance coherence.|secondary normally assessed for a specific use or a class of usages when datasets are selected.| |\n|intrinsic|primary and secondary plausibility measures can be used to detect a (limited) class of reliability issues.| |primary and secondary completeness measures based on a data model are easy to implement.|primary and secondary coherence can be measured exclusively based on data (with eventual access to the datasets (e.g., event|primary and secondary some aspects of timeliness may be observed in the datasets (e.g., event relevance of data is not dependent on a dataset itself.|\n\nofficial address domenico scarlattilaan 6 * 1083 hs amsterdam * the netherlands\n\naddress for visits and deliveries refer to www.ema.europa.eu/how-to-find-us\n\nsend us a question go to www.ema.europa.eu/contact telephone +31 (0)88 781 6000 an agency of the european union\n\n(c) european medicines agency, 2023. reproduction is authorised provided the source is acknowledged.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "f7f429a3-ab04-4aa0-8e90-988065bb000b": {"__data__": {"id_": "f7f429a3-ab04-4aa0-8e90-988065bb000b", "embedding": null, "metadata": {"page_label": "29", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Assessing Data Quality Across Multiple Dimensions: Principles and Challenges\"", "questions_this_excerpt_can_answer": "1. 
How does the data quality framework for EU medicines regulation propose to measure the accuracy of data directly, and what challenges are associated with these measures?\n \n2. In the context of EU medicines regulation, how is the concept of coherence defined and assessed across different data sources, and what specific elements are necessary to improve coherence according to the framework?\n\n3. What criteria does the data quality framework suggest for determining the relevance of data in regulatory processes, and how does it relate to the specific questions being asked within the EU medicines regulation context?", "prev_section_summary": "This excerpt from the document titled \"Guidelines for Assessing Data Reliability and Coherence in Regulatory Contexts: A European Medicines Agency Perspective\" outlines the European Medicines Agency (EMA)'s framework for evaluating data quality in the context of EU medicines regulation. The framework is structured around five key dimensions: reliability, extensiveness, coherence, timeliness, and relevance. These dimensions are considered for both primary and secondary data sources.\n\n1. **Reliability**: This dimension is foundational and can be ensured by following established protocols for data collection. The reliability of both primary and secondary data is dependent on the systems and processes in place for data generation or collection. However, secondary data may face issues of decreased precision during transformation and harmonization processes.\n\n2. **Extensiveness**: This refers to the completeness of the data set and its coverage for the intended use case. For primary data, the collection protocol determines what data are collected. Secondary data, however, may lack guarantees on completeness for different use cases, and this aspect can only be assessed or controlled to a certain extent.\n\n3. 
**Coherence**: This dimension is crucial for both primary and secondary data and depends on the orchestration of processes originating data and adherence to internal or external data standards. Coherence can be enhanced through shared standards, reference data, and documentation on data generation processes.\n\n4. **Timeliness**: This aspect is normally guaranteed by the design of the data collection process and is assessed for specific uses or classes of usages when datasets are selected. Some aspects of timeliness can be observed directly in the datasets.\n\n5. **Relevance**: The relevance of data is determined by the systems and processes in place and is not dependent on the dataset itself. It is a critical factor in ensuring that the data collected and analyzed meet the regulatory needs.\n\nThe document emphasizes the importance of these dimensions in assessing data quality within the regulatory framework for medicines in the EU. It highlights the need for established protocols, systems, and processes to ensure the reliability, extensiveness, coherence, timeliness, and relevance of data used in regulatory assessments.", "excerpt_keywords": "data quality, EU medicines regulation, coherence, reliability, relevance"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\n|dimension|reliability|extensiveness|coherence|timeliness|relevance|\n|---|---|---|---|---|---|\n|direct measures of accuracy require access to the source of data.|secondary coverage measures are more complex and may require confrontation to a golden standard.|secondary coherence can be largely improved based solely on a dataset and data-independent elements (e.g., mapping to a common standard). a full resolution of coherence may require access to additional information on processes. 
coherence needs to be assessed every time a new data source is \"integrated\".|dates to determine currency. a dataset itself cannot in general reveal how current its information is.|\n|primary processes and systems to collect data are usually designed to answer a specific question and to meet the required targets, across dq dimensions, that such target entails.|secondary threshold for acceptable reliability can be defined only with respect to a specific question and method.|secondary coverage and completeness depend on a question: metrics can be defined only with respect to a specific question and method, or for a domain. 
for completeness, typically a question would determine a set of acceptance thresholds and general metrics.|some assessment of semantic coherence (data distribution coherence or abstraction coherence) may only be measured with respect to a specific question and method.|acceptable timeliness depends on the question and its broader regulatory usage (e.g., approval vs monitoring).|relevance can only be determined in relation to one or more questions.|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "e2da286f-7610-4983-824d-17f34d24f0db": {"__data__": {"id_": "e2da286f-7610-4983-824d-17f34d24f0db", "embedding": null, "metadata": {"page_label": "30", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Improving Data Integrity in European Medicines Regulation: Strategies for Ensuring Metadata Reliability, Coherence, and Timeliness\"", "questions_this_excerpt_can_answer": "1. How does the EU medicines regulation framework define the importance of metadata reliability, extensiveness, coherence, and timeliness in the context of data quality for both primary and secondary data?\n \n2. What specific strategies does the EU medicines regulation document propose for ensuring the timeliness of metadata, especially when data are repurposed and used across different systems?\n\n3. 
In the context of the EU medicines regulation, how are intrinsic measures for metadata data quality (meta DQ) differentiated from those for data quality, particularly in terms of completeness and the presence of missing fields?", "prev_section_summary": "The excerpt from the document titled \"Assessing Data Quality Across Multiple Dimensions: Principles and Challenges\" focuses on the data quality framework for EU medicines regulation, emphasizing the measurement and assessment of data across various dimensions including reliability, extensiveness, coherence, timeliness, and relevance. Key topics discussed include:\n\n1. **Accuracy**: The framework suggests that direct measures of accuracy necessitate access to the original data source. It highlights the complexity of secondary coverage measures, which may require comparison to a \"golden standard\" to assess accuracy effectively.\n\n2. **Coherence**: The document underscores the importance of coherence, which can be significantly enhanced through dataset-specific and data-independent elements, such as mapping to a common standard. Achieving full coherence might necessitate additional information about processes, and it should be evaluated whenever a new data source is integrated.\n\n3. **Relevance**: The relevance of data is determined by its relation to specific questions within the regulatory context. The framework indicates that primary processes and systems for data collection are typically designed with a particular question in mind, aiming to meet the required targets across different data quality (DQ) dimensions.\n\n4. **Timeliness and Extensiveness**: Timeliness is defined by the currency of the data, which cannot generally be determined from the dataset itself but rather through the dates it contains. 
Extensiveness, while not explicitly detailed in the excerpt, is implied to be related to the coverage and completeness of data, which vary based on the specific questions and methods applied.\n\nThe document also points out that the threshold for acceptable reliability, coverage, completeness, and semantic coherence (including data distribution and abstraction coherence) is question-specific and method-dependent. It emphasizes that these dimensions of data quality must be tailored to the specific regulatory questions and contexts within the EU medicines regulation framework.", "excerpt_keywords": "metadata reliability, data quality framework, EU medicines regulation, metadata coherence, metadata timeliness"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\n|determinant/dimension|reliability|extensiveness|coherence|timeliness|relevance|\n|---|---|---|---|---|---|\n|foundational|primary|primary|primary|primary|primary|\n| |for primary data, the extensiveness of metadata can be characterised at source.|for primary data, the extensiveness of metadata can be characterised at source.|metadata coherence relies on the presence of common standards and terminologies.|normally guaranteed by the design of data collection process.| |\n| |primary and secondary|reliability of metadata relies on the processes to collect it, along the whole data processing chain.| | | |\n| |one key aspect to ensure reliability is to capture metadata as close to the source as possible.| | | | |\n| |secondary|for secondary data, coherence relies on the presence of widely agreed standards and shared resources such as ontologies or reference data services.| | | |\n| | |when data are repurposed and used in different systems, timeliness of metadata should be enforced by design (metadata should be in synch with the data).| | | |\n|intrinsic|primary and secondary|primary and secondary|primary and 
secondary|primary and secondary|primary and secondary|\n| |some metadata (e.g., summary statistics) can be generated from a dataset.| | | | |\n| |when data and metadata are considered as a whole, traceability can also be assessed by intrinsic measures.| | | | |\n| | |intrinsic measures for meta dq mimic the ones for data (e.g., completeness and missing fields).|unlike data, metadata assessment may not require references to| | |\n\ndata quality framework for eu medicines regulation\n\nema/326985/2023\n\npage 30/42", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "153dbd8b-c3db-440c-94ab-dc317da7c899": {"__data__": {"id_": "153dbd8b-c3db-440c-94ab-dc317da7c899", "embedding": null, "metadata": {"page_label": "31", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Enhancing Data Quality and Compliance in EU Medicines Regulation through Metadata Standards and Frameworks\"", "questions_this_excerpt_can_answer": "Based on the provided excerpt from the document titled \"Enhancing Data Quality and Compliance in EU Medicines Regulation through Metadata Standards and Frameworks,\" here are three questions that this context can provide specific answers to, which are unlikely to be found elsewhere:\n\n1. 
**How does the EU medicines regulation data quality framework address the relevance and reliability of metadata in relation to specific regulatory questions?**\n - This question is pertinent because the excerpt discusses how metadata requirements are designed for specific questions and how the relevance and reliability of metadata are considered independently of the coherence or timeliness of a specific question. The document seems to offer a unique perspective on tailoring metadata requirements to enhance data quality in regulatory contexts, making it a valuable source for understanding these processes.\n\n2. **What principles guide the characterization and necessity of metadata within the EU medicines regulation data quality framework?**\n - The excerpt hints at a nuanced approach to determining what metadata are necessary, depending on the question or range of questions at hand. This question seeks to uncover the underlying principles or criteria used within the framework to evaluate the characterisation and necessity of metadata, which appears to be a specific focus of the document not commonly detailed in other sources.\n\n3. **In what ways does the EU medicines regulation data quality framework ensure the independence of metadata quality from specific regulatory questions?**\n - Given that the excerpt mentions the independence of metadata quality from specific questions, this question aims to explore the mechanisms or standards implemented to maintain metadata quality irrespective of the regulatory context. 
This aspect of the framework is likely detailed uniquely in this document, providing insights into how metadata standards are applied consistently across varying regulatory scenarios.\n\nThese questions are designed to delve into the specific insights and methodologies outlined in the document regarding the handling, relevance, and quality of metadata in the context of EU medicines regulation, leveraging the unique focus of the provided excerpt.", "prev_section_summary": "This section from the document titled \"Improving Data Integrity in European Medicines Regulation: Strategies for Ensuring Metadata Reliability, Coherence, and Timeliness\" focuses on the data quality framework for EU medicines regulation, specifically addressing the importance and strategies for ensuring the quality of metadata in the context of data integrity. The key topics discussed include the determinant dimensions of metadata quality such as reliability, extensiveness, coherence, timeliness, and relevance, with a particular emphasis on primary and secondary data.\n\nThe document outlines that for primary data, the extensiveness of metadata is crucial and should be characterized at the source. It emphasizes that metadata reliability depends on the collection processes throughout the data processing chain and highlights the importance of capturing metadata as close to the source as possible to ensure its reliability. For secondary data, coherence is underscored as relying on widely agreed standards and shared resources like ontologies or reference data services. The document also points out that when data are repurposed for use in different systems, the timeliness of metadata must be enforced by design to keep it synchronized with the data.\n\nFurthermore, the section introduces the concept of intrinsic measures for metadata data quality (meta DQ), which mimic those for data quality, such as completeness and the absence of missing fields. 
It suggests that intrinsic measures can help assess traceability when data and metadata are considered as a whole.\n\nEntities mentioned include:\n- EU medicines regulation framework\n- Primary and secondary data\n- Metadata quality dimensions: reliability, extensiveness, coherence, timeliness, relevance\n- Standards and shared resources like ontologies or reference data services\n- Intrinsic measures for metadata data quality (meta DQ)\n\nThe document reference is EMA/326985/2023, found on page 30 of 42 in the PDF titled \"Improving Data Integrity in European Medicines Regulation: Strategies for Ensuring Metadata Reliability, Coherence, and Timeliness\".", "excerpt_keywords": "metadata standards, EU medicines regulation, data quality framework, metadata reliability, regulatory compliance"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\ngolden standards (e.g., missing metadata values are not related to sampling of a question-specific primary population).\n\nmetadata requirements are designed for a specific question and are normally sufficient to address it.\n\nprimary and secondary: metadata should be in general reliable independently of a specific question (not all metadata collected may be relevant for all questions).\n\nprimary and secondary: the coherence of metadata is independent from a specific question.\n\nprimary and secondary: timeliness of metadata is independent from a specific question.\n\nsecondary: the relevance of metadata is purely dependent on a question (or range of questions).\n\nsecondary: characterisation of what metadata are necessary is ultimately dependent on a 
question (or set of typical questions)\n\n# data quality framework for eu medicines regulation\n\nema/326985/2023 page 31/42", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "dd431761-d388-47ef-a8e9-effeb06ba30a": {"__data__": {"id_": "dd431761-d388-47ef-a8e9-effeb06ba30a", "embedding": null, "metadata": {"page_label": "32", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Enhancing Data Quality in the European Medicines Agency through Documentation and FAIR Principles: A Guide to Developing Maturity Models\"", "questions_this_excerpt_can_answer": "1. What are the foundational determinants necessary for managing data quality (DQ) in the context of EU medicines regulation, and how are they characterized within maturity models?\n \n2. How does the document recommend implementing FAIR principles in the development of maturity models for enhancing data quality in the European Medicines Agency?\n\n3. What specific documentation requirements are outlined for Level 1 maturity in ensuring data adequacy for decision-making within the European Medicines Agency's data quality framework?", "prev_section_summary": "The section from the document titled \"Enhancing Data Quality and Compliance in EU Medicines Regulation through Metadata Standards and Frameworks\" focuses on the principles and standards related to metadata within the EU medicines regulation data quality framework. Key topics include:\n\n1. 
**Golden Standards for Metadata**: It emphasizes that metadata requirements are tailored to specific regulatory questions, ensuring that the metadata collected are sufficient and reliable for addressing those questions. It also highlights the independence of metadata quality from the coherence or timeliness of the specific regulatory question, suggesting that not all metadata collected may be relevant for every question. This indicates a nuanced approach to metadata collection and evaluation, where the relevance and reliability of metadata are considered crucial.\n\n2. **Characterization and Necessity of Metadata**: The document discusses how the necessity and characterization of metadata are dependent on the specific questions or range of questions being addressed. This suggests a flexible and question-oriented approach to determining what metadata are essential, underlining the importance of tailoring metadata requirements to the specific needs of regulatory questions.\n\n3. **Independence of Metadata Quality**: A significant point made is the independence of metadata quality from specific regulatory questions. 
This means that the standards for metadata quality are maintained consistently, regardless of the particular context or question at hand, ensuring a high level of data integrity and reliability across different regulatory scenarios.\n\nEntities mentioned include:\n- **EMA (European Medicines Agency)**: Indicated by the reference \"ema/326985/2023,\" suggesting that this document or framework is associated with or endorsed by the EMA.\n- **EU Medicines Regulation**: The broader regulatory context within which this data quality framework is applied, focusing on enhancing data quality and compliance through specific standards and frameworks for metadata.\n\nOverall, the section outlines a sophisticated approach to managing metadata within the EU medicines regulation context, emphasizing the importance of specificity, reliability, and independence in ensuring high-quality data for regulatory purposes.", "excerpt_keywords": "data quality, FAIR principles, maturity models, European Medicines Agency, documentation requirements"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\n# 7.1. foundational determinants: recommendation and maturity levels\n\na characterisation of the systems and processes underpinning data generation and manipulation (foundational determinants) is necessary to manage dq. below is a set of defined maturity levels, each providing a progressive hierarchy of recommendations for the characterisation of foundational determinants, with the intention to chart a direction of improvement towards adequate and efficient characterisation of these aspects of dq (see figure 5). 
it is recommended that fair principles [16] for data and metadata be implemented as early as possible, or partially, along maturity models.\n\nfigure 5 - maturity model for data quality determinants [figure showing the maturity levels for each group of determinants: foundational (documented, formalised, implemented, automated), intrinsic (metadata, standardised, automated, feedback) and question-specific (ad-hoc, domain-defined, question-defined)]\n\n# 7.1.1. level 1: documented\n\nfor data to be adequate for decision making, at a minimum, the processes that pertain to data generation and manipulation should be documented, true, verifiable (when relevant, this may extend to training procedures) and versioned. this is fundamental and ensures the reliability of any derived information. the documentation should cover determinants for reliability (precision), extensiveness, coherence and, when relevant, timeliness. while some of these determinants depend on a specific question, data collection processes and systems will generally be designed with some generic questions in mind. the provision of documentation for data processing and transformation is also essential to guarantee that reliability is preserved and should be provided for all such processing by different actors along the data life cycle.\n\nofficial address domenico scarlattilaan 6 * 1083 hs amsterdam * the netherlands\n\naddress for visits and deliveries refer to www.ema.europa.eu/how-to-find-us\n\nsend us a question go to www.ema.europa.eu/contact telephone +31 (0)88 781 6000 an agency of the european union\n\n(c) european medicines agency, 2023. 
reproduction is authorised provided the source is acknowledged.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "bda71a77-2b0c-48e2-b369-fbfa08fd367e": {"__data__": {"id_": "bda71a77-2b0c-48e2-b369-fbfa08fd367e", "embedding": null, "metadata": {"page_label": "33", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Enhancing EU Medicines Regulation through a Data Quality Maturity Model: Standards, Implementation, and Automation Framework with a Focus on Intrinsic Determinants\"", "questions_this_excerpt_can_answer": "1. What are the specific levels of maturity defined in the Data Quality Maturity Model for enhancing EU medicines regulation, and how do they contribute to improving data quality and management practices?\n\n2. How does the document describe the transition from manual to automated processes in the context of data quality management for EU medicines regulation, and what are the key benefits of achieving the highest level of maturity (automated)?\n\n3. 
What role do intrinsic determinants play in the assessment of data quality within the EU medicines regulation framework, and how are these determinants measured or evaluated at different maturity levels according to the document?", "prev_section_summary": "This section from the document titled \"Enhancing Data Quality in the European Medicines Agency through Documentation and FAIR Principles: A Guide to Developing Maturity Models\" focuses on the foundational determinants necessary for managing data quality (DQ) in the context of EU medicines regulation. It introduces a set of maturity levels designed to improve the characterization of foundational determinants, which are crucial for ensuring data quality. The document emphasizes the importance of implementing FAIR principles (Findability, Accessibility, Interoperability, and Reusability) for data and metadata as early as possible within these maturity models.\n\nThe maturity model for data quality determinants is outlined, suggesting a progression from documented processes to automated and standardized procedures, including feedback mechanisms and domain-defined metadata. At Level 1 maturity, the document specifies that processes related to data generation and manipulation must be documented, true, verifiable, and versioned to ensure data reliability for decision-making. This includes covering determinants for reliability, extensiveness, coherence, and timeliness, with a focus on preserving reliability through documentation of data processing and transformation by various actors throughout the data lifecycle.\n\nThe section also provides contact information for the European Medicines Agency (EMA), including its official address in Amsterdam, the Netherlands, and how to contact or visit the agency. 
The copyright notice indicates that the document is authorized for reproduction provided the source is acknowledged, with a copyright year of 2023.", "excerpt_keywords": "Data Quality Maturity Model, EU Medicines Regulation, Automated Processes, Intrinsic Determinants, FAIR Principles"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\nfrom a metadata perspective, this means metadata (in some form) should always accompany a dataset it refers to.\n\nto guarantee the quality of data, audit procedures or other controls should be in place.\n\nwhen a system is designed for continuous data collection (as opposed to a one-off capture), additional processes of performance monitoring and improvement should be in place.\n\n# 7.1.2. level 2: formalised\n\nthe second level of the maturity model includes and extends the first level, by requiring that, whenever possible, documentation and metadata should be following an industry standard framework or qms 32. level 2 should be considered the minimal level of acceptable maturity, though exceptions may arise for novel data types. the recommendation to use standards extends to metadata.\n\n# 7.1.3. level 3: implemented\n\nsystems are in place that implement industry standard dq processes systematically and by design. infrastructure should be in place to support data management, including support for standardisation (e.g., reference data management or mdm). by reducing the potential for human error, such an implementation can generally improve reliability and coherence. such an implementation may also be necessary to guarantee timeliness and it should ensure that metadata are collected by design, and as close to the data generation or collection events as possible.\n\n# 7.1.4. 
level 4: automated\n\nthe operations and output of the above systems and infrastructure should be machine readable so as to unify data and dq elements for direct downstream consumption. all data and metadata should be represented following fair principles [16] to allow complete automatic processing of quality parameters. this is intended to be an aspirational level.\n\n# 7.2. intrinsic determinants: recommendations and maturity levels\n\nbeyond documented evidence of how data were collected or generated, measures of intrinsic aspects of dq can be applied. these can be directly derived from the dataset, but their computation could also rely on some external body of knowledge.\n\n# 7.2.1. level 0: intrinsic\n\nthere are no hard minimal requirements for quality, as any piece of data can be assessed before being used to generate evidence 33. nevertheless, the propagation of data without an associated quality assessment should be discouraged.\n\n# 7.2.2. level 1: metadata\n\ndata are provided with a set of quality metrics as metadata. some of these data can be directly derived from the dataset while others derive from the overall data collection process (e.g., sampling).\n\n32 which industry standards or frameworks apply depends on specific data types and use cases, and as such can be defined only in specialisations of this framework. 
some initial references are however provided in the \"implementation notes\" section.\n\n33 this initial level is assigned \"0\" to clarify that it corresponds to data \"as is\" irrespective of their intended use for regulatory decision making.\n\ndata quality framework for eu medicines regulation ema/326985/2023 page 33/42", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "4c86b6e3-3ff5-471b-b53c-d1cdd4429c1c": {"__data__": {"id_": "4c86b6e3-3ff5-471b-b53c-d1cdd4429c1c", "embedding": null, "metadata": {"page_label": "34", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Enhancing Regulatory Decision Making through a Data Quality Framework: Emphasizing Standardization, Automation, and Continuous Improvement\"", "questions_this_excerpt_can_answer": "1. How does the Data Quality Framework for EU Medicines Regulation propose to standardize data quality metrics across datasets, and what role do shared definitions play in this process?\n\n2. What are the specific steps outlined in the framework for automating the quality assessment of data, and how do FAIR principles integrate into this automation process?\n\n3. 
How does the framework suggest handling the assessment of data quality in relation to specific regulatory questions, and what maturity levels are defined for ensuring data relevance and precision for decision-making purposes?", "prev_section_summary": "This section outlines the Data Quality Maturity Model designed to enhance EU medicines regulation through improved data quality and management practices. It describes four levels of maturity:\n\n1. **Level 2: Formalised** - This level builds upon the initial stage by requiring documentation and metadata to adhere to an industry standard framework or Quality Management System (QMS). It is considered the minimal acceptable level of maturity, emphasizing the importance of standards for metadata and documentation.\n\n2. **Level 3: Implemented** - At this stage, systems are established to systematically implement industry-standard data quality (DQ) processes. Infrastructure supports data management and standardization, such as reference data management or Master Data Management (MDM), aiming to reduce human error and ensure reliability, coherence, and timeliness. Metadata collection is integrated by design, close to data generation or collection events.\n\n3. **Level 4: Automated** - This aspirational level involves making operations and outputs machine-readable to unify data and DQ elements for direct downstream consumption. Data and metadata are organized following FAIR principles (Findable, Accessible, Interoperable, and Reusable) to facilitate complete automatic processing of quality parameters.\n\nThe section also discusses **intrinsic determinants** of data quality, highlighting the importance of assessing the intrinsic aspects of data quality, which can be derived from the dataset itself or an external body of knowledge. 
It introduces a preliminary level (Level 0: Intrinsic) where data can be assessed before use to generate evidence, emphasizing that data propagation without quality assessment should be discouraged. At Level 1: Metadata, data are accompanied by quality metrics as metadata, derived either directly from the dataset or the overall data collection process.\n\nThe document underscores the transition from manual to automated processes in data quality management for EU medicines regulation and the benefits of achieving the highest level of maturity, including improved reliability, coherence, and timeliness of data. It also highlights the role of intrinsic determinants in assessing data quality at different maturity levels.", "excerpt_keywords": "data quality, EU medicines regulation, FAIR principles, automation, standardization"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\n# metadata should also cover the description of data elements that are necessary for its interpretation (e.g., data dictionaries).\n\n# 7.2.3. level 2: standardised\n\ndata are provided with a standardised set of quality metrics, which can be compared across datasets. when applicable or possible, standards should extend to cover reference knowledge that can be used to assess a dataset in respect to what it is meant to represent (e.g., typical population distributions to assess biases). metadata makes use of shared definitions, which also enable comparability and integration across datasets.\n\n# 7.2.4. level 3: automated\n\nquality assessment is automated (at least for a large extent of metrics). in general, this is feasible only when data are represented in standard ways (e.g., in a cdm), so that a standard library of tests can be run on incoming data. data and metadata should follow fair principles [16].\n\n# 7.2.5. 
level 4: feedback\n\nthere is a data ecosystem in place so that quality assessment by data consumers can provide feedback to improve the data collection and production process, thus allowing a continuous monitoring and improvement of dq.\n\n(note that the order of maturity of level 2 and 3 may change for particular data types.)\n\n# 7.3. question-specific determinants: recommendations and maturity levels\n\nin general, it is not possible to assess the relevance of a dataset, or aspects of extensiveness and precision, without a target question and a defined analysis strategy. however, when considering the adoption of a large body of data for regulatory decision making and its possible use beyond primary use cases, it becomes important to articulate to what degree dq, including relevance, can be assessed a-priori.\n\n# 7.3.1. level 1: ad-hoc\n\nall dimensions that are question specific are assessed only at \"question time\" on an ad-hoc basis.\n\n# 7.3.2. level 2: domain-defined\n\na range of common questions is identified, from which metrics and thresholds can be derived that can be used to guarantee acceptable levels of quality. data published in data catalogues should make use of such metrics.\n\n# 7.3.3. level 3: question-defined\n\nthe requirements for a specific question are precisely codified and can be mapped to metrics and thresholds in a way that could automatically assess the relevance of a dataset for a specific question. at this level, the context under which data will be interpreted for decision making should be formalised.\n\nas for foundational determinants, fair principles should be applied as early as possible, at least partially. 
level 3 requires a full implementation of fair principles.\n\ndata quality framework for eu medicines regulation ema/326985/2023 page 34/42", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "32fc2986-e25c-49ec-bbc9-521b901c5732": {"__data__": {"id_": "32fc2986-e25c-49ec-bbc9-521b901c5732", "embedding": null, "metadata": {"page_label": "35", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Enhancing Data Integrity in EU Medicines Regulation: A Comprehensive Guide to Data Quality Frameworks, Quality at Source, Master Data Management, and the Role of Quality Management Systems\"", "questions_this_excerpt_can_answer": "1. How does the document suggest addressing data quality (DQ) issues as early as possible in the data collection and generation processes, and what specific example does it provide to illustrate the importance of this approach?\n \n2. What role does Master Data Management (MDM) and reference data play in ensuring data quality within the EU medicines regulation framework, and how does it specifically impact data consistency and reliability according to the document?\n\n3. 
How does the document outline the integration of Quality Management Systems (QMS) and computerised systems in enhancing data quality for EU medicines regulation, including specific standards and directives for different types of data (e.g., clinical trial data, medical devices data)?", "prev_section_summary": "The section from the document \"Enhancing Regulatory Decision Making through a Data Quality Framework: Emphasizing Standardization, Automation, and Continuous Improvement\" outlines a structured approach to improving data quality in the context of EU medicines regulation. It introduces a multi-level framework aimed at standardizing, automating, and continuously improving the quality of data used in regulatory decision-making. Key topics and entities include:\n\n1. **Metadata and Data Dictionaries**: The importance of comprehensive metadata and data dictionaries for interpreting data elements is highlighted, emphasizing the need for clear descriptions.\n\n2. **Standardization (Level 2)**: The framework proposes standardizing data quality metrics across datasets to enable comparability and integration. This involves using shared definitions and, where possible, extending standards to include reference knowledge for dataset assessment.\n\n3. **Automation (Level 3)**: It suggests automating the quality assessment process to a significant extent, which is feasible when data are standardized (e.g., through a common data model, CDM). Automation should adhere to FAIR (Findable, Accessible, Interoperable, Reusable) principles to ensure data and metadata are handled efficiently.\n\n4. **Feedback and Continuous Improvement (Level 4)**: A data ecosystem that allows for feedback from data consumers is advocated to enhance the data collection and production processes. This feedback mechanism facilitates continuous monitoring and improvement of data quality (DQ).\n\n5. 
**Question-Specific Determinants**: The document acknowledges the challenge of assessing data relevance and quality without a specific regulatory question and analysis strategy in mind. It outlines three maturity levels for addressing question-specific determinants of data quality:\n - **Ad-Hoc (Level 1)**: Quality assessment is performed on an ad-hoc basis at the time of the question.\n - **Domain-Defined (Level 2)**: Identifies a range of common questions to derive metrics and thresholds for ensuring acceptable quality levels, which are then applied to data in catalogues.\n - **Question-Defined (Level 3)**: Precisely codifies requirements for specific questions, allowing for the automatic assessment of a dataset's relevance to a particular question. This level necessitates a full implementation of FAIR principles.\n\nOverall, the section outlines a comprehensive framework for enhancing the quality of data used in regulatory decision-making within the EU medicines regulation context, focusing on standardization, automation, feedback mechanisms, and the application of FAIR principles.", "excerpt_keywords": "Data Quality Framework, EU Medicines Regulation, Master Data Management, Quality Management Systems, Data Integrity"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\n# and shareable. this is the natural level for primary use cases, while for secondary use of data, this should be intended as an aspirational level.\n\n8. considerations for implementation of dqf\n\nthis section provides a set of observations and recommendations to guide the implementation of this dqf and its specialisations, to help achieving higher levels of maturity.\n\n# 8.1. quality at source\n\nas a general guideline, in designing data collection and generation processes, aspects of dq should be addressed as early as possible. 
for instance, assessment of quality done close to the moment of production can help in correcting a collection error. the further data travels from the original context, the harder it becomes to correct issues. this is particularly relevant for metadata as knowledge of the context of data generation is maximally present only at generation time.\n\n# 8.2. the role of master data management (mdm) and reference data\n\nthe availability of mdm and reference data has a direct impact on dq. it is often a pre-requisite for data consistency, and it can even impact reliability in some data production scenarios (e.g., materials data), as disconnected information can result in erroneous information. more broadly, mdm and reference data enable automation of a range of dq checks and hence have an impact on reliability as well.\n\nshared mdm and reference data can address aspects of coherence beyond the primary use case that the data were generated for, as the use of standards guarantees some level of semantic coherence is maintained even beyond data aggregation steps.\n\n# 8.3. the role of qms and computerised systems\n\nthe implementation of a dqf at higher maturity levels requires the formalisation and implementation of systems and processes to support dq.\n\na quality management system (qms) [20] is a formalised approach adopted by an organisation that documents processes, procedures, and responsibilities for achieving quality policies and objectives (e.g., good clinical practices [gcp], good laboratory practices [glp] or good manufacturing practice [gmp]). it achieves these quality objectives through quality planning, quality assurance, quality control and quality improvement. standards like the iso 9000 family define qms across industries, while more specific qms have been developed for specific industry or products. 
life science industry specific qmss should be considered depending on the nature of the data:\n\nclinical trial data: iso 14155 and eu directive 2001/20/ec for gcp (clinical trial data)\ndata from medical devices or diagnostic products: iso 13485 quality system regulation (qsr)\ndata from lab research: eu directive 2004/9/ec and 2004/10/ec, glp\ndata from clinical labs: iso 15189 and iso 17025\n\nwhenever possible, dq processes should be framed in the context of standard qmss.\n\nfurthermore, in today's digital world, foundational dq determinants are also impacted by computerised systems that are used to create, modify, maintain, archive, retrieve, or transmit data. a software development life cycle, including a software quality assurance system, ensures the appropriate design, development and testing of the software. this can be targeted through the iso 250xx standard family,\n\ndata quality framework for eu medicines regulation ema/326985/2023 page 35/42", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "41c60b4d-8b61-45e3-a9ac-b76f19cc786b": {"__data__": {"id_": "41c60b4d-8b61-45e3-a9ac-b76f19cc786b", "embedding": null, "metadata": {"page_label": "36", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Ensuring Data Integrity in Clinical Trials: Implementing Quality Management Systems and Standards\"", "questions_this_excerpt_can_answer": "1. How does the ISO 25012 standard specifically address data models within the context of ensuring data integrity in clinical trials?\n \n2. 
What specific guidance does the EMA/226170/2021 guideline offer regarding the implementation of computerised systems and electronic data in clinical trials, and how does it suggest incorporating Quality Management Systems (QMS) for a Data Quality Framework (DQF) implementation?\n\n3. How does the ISO 8000 standard differ from ISO 9001 in the context of data quality management, especially regarding the treatment of data as a product within the pharmaceutical industry's regulatory framework?", "prev_section_summary": "This section of the document focuses on the implementation of a Data Quality Framework (DQF) for EU medicines regulation, offering observations and recommendations to achieve higher levels of data quality maturity. Key topics include:\n\n1. **Quality at Source**: It emphasizes the importance of addressing data quality (DQ) issues as early as possible in the data collection and generation processes. An example provided is the assessment of quality close to the moment of production to correct collection errors, highlighting the challenge of correcting data the further it moves from its original context.\n\n2. **Master Data Management (MDM) and Reference Data**: The document discusses the critical role of MDM and reference data in ensuring data consistency and reliability. It notes that MDM and reference data are often prerequisites for data consistency and can significantly impact reliability, especially in scenarios like materials data. The automation of DQ checks facilitated by shared MDM and reference data is also highlighted, along with the maintenance of semantic coherence beyond primary data use cases.\n\n3. **Quality Management Systems (QMS) and Computerised Systems**: The integration of QMS and computerised systems in supporting DQ is outlined. 
The document references specific standards and directives relevant to different types of data, such as clinical trial data (ISO 14155 and EU Directive 2001/20/EC), medical devices data (ISO 13485), lab research data (EU Directives 2004/9/EC and 2004/10/EC), and clinical labs data (ISO 15189 and ISO 17025). It stresses the importance of framing DQ processes within the context of standard QMS and highlights the role of computerised systems and software quality assurance in maintaining DQ, referencing the ISO 250xx standard family.\n\nEntities mentioned include:\n- **Data Quality (DQ)**\n- **Master Data Management (MDM)**\n- **Reference Data**\n- **Quality Management System (QMS)**\n- **ISO Standards** (ISO 9000, ISO 14155, ISO 13485, ISO 15189, ISO 17025, ISO 250xx)\n- **EU Directives** (2001/20/EC, 2004/9/EC, 2004/10/EC)\n\nThe section underscores the multifaceted approach required to enhance data integrity within EU medicines regulation, involving early intervention in data generation, leveraging MDM and reference data for consistency, and integrating QMS and computerised systems for comprehensive quality assurance.", "excerpt_keywords": "Data Quality Framework, ISO 25012, EMA/226170/2021, Quality Management Systems, Computerised Systems Validation"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\n# systems and software quality requirements and evaluation\n\n(more specifically, data models can be addressed with iso 25012). computer system qualification and validation ensures the software is appropriately implemented, and necessary process controls are in place for using it according to its specifications, including documentation, access control, vendor management and audits. the ema/226170/2021 guideline on computerised systems and electronic data in clinical trials provides direction for gcp but can be adopted more broadly. 
the following decision tree provides guidance on how to consider qmss for a dqf implementation (see figure 6).\n\n[figure 6 - decision tree for qms adoption in dqf implementation: the tree branches on whether an existing regulation/guideline covers the data type and whether the quality focus is on the data or the product; the possible outcomes are to implement guideline directives or iso standards, to implement a software development life cycle, or to implement a quality management system for the specific product.]\n\n# the role of iso and industry standards\n\nthe international organisation for standardisation (iso) has produced standards providing frameworks for the implementation of various data management aspects, that are field-tested and for which platforms, supporting services and certification bodies are established. these standards are often developed for implementation in industries where ema's regulatory decision making does not apply.\n\niso 9000: describes the standards for quality management systems on all levels of an organisation. the adoption of this standard could be considered if no industry-specific qms applies.\n\niso 8000: describes the standards for the quality of master data and their exchange between systems. the standard describes how master data conform to a set of specifications expressed in a formal syntax and use specified identifiers to check against data requirements that point to a data dictionary. this standard also covers the methods for achieving data governance, data quality management, data quality assessment and rules for determining the quality of master data and industrial data. this includes the exchange of data characteristics and identifiers.\n\nto clarify, the iso 8000 series does not establish a new management system. 
the series, instead, extends and clarifies iso 9001 for the case where data are the product.\n\ndata quality framework for eu medicines regulation ema/326985/2023 page 36/42", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "4dee3614-132d-4f05-9e97-ff9c83e04950": {"__data__": {"id_": "4dee3614-132d-4f05-9e97-ff9c83e04950", "embedding": null, "metadata": {"page_label": "37", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Enhancing Regulatory Decision Making in the Pharmaceutical Industry through the Integration of Data Quality Standards and Principles\"", "questions_this_excerpt_can_answer": "1. How does the ISO 25012 standard contribute to regulatory decision-making in the pharmaceutical industry, and what are its key components for ensuring data quality throughout the data lifecycle?\n \n2. In what ways does the ALCOA+ framework align with the dimensions of the Data Quality Framework (DQF) mentioned in the document, and how does it specifically enhance the reliability of data in regulatory submissions within the pharmaceutical sector?\n\n3. What are the challenges and considerations associated with implementing data quality controls, such as validation against source records, within the context of EU medicines regulation as outlined in the document?", "prev_section_summary": "This section discusses the importance of data integrity in clinical trials, focusing on the implementation of Quality Management Systems (QMS) and standards to ensure the quality of data. 
It highlights the role of ISO 25012 in addressing data models and emphasizes the significance of computer system qualification and validation as outlined in the EMA/226170/2021 guideline. This guideline provides direction for Good Clinical Practice (GCP) and suggests a broader adoption for ensuring data integrity through a Data Quality Framework (DQF).\n\nA decision tree is presented to guide the implementation of QMS for DQF, distinguishing between regulated data types focusing on product output versus unregulated data types where data are the output. It suggests following guideline directives or ISO standards, implementing software development life cycles, and adopting quality management systems for specific products.\n\nThe section also compares ISO 8000 and ISO 9001 standards in the context of data quality management. ISO 8000 focuses on the quality of master data and their exchange, extending and clarifying ISO 9001 to treat data as a product. It covers data governance, quality management, assessment, and rules for determining the quality of master data and industrial data.\n\nKey entities mentioned include:\n- ISO 25012: Standard for data models in ensuring data integrity.\n- EMA/226170/2021 guideline: Provides direction on computerised systems and electronic data in clinical trials.\n- ISO 8000: Standard for the quality of master data and their exchange.\n- ISO 9001: Standard for quality management systems.\n- Data Quality Framework (DQF): A framework for implementing QMS to ensure data integrity.", "excerpt_keywords": "Data Quality Framework, ISO 25012, ALCOA+, EU Medicines Regulation, Regulatory Decision Making"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\nprocessing like creating, collecting, storing, maintaining, transferring, exploiting, and presenting data to deliver information. 
this standard is valuable as master data play an integral role in the process of generation of data for regulatory decision making.\n\niso 25012: defines a general data quality model for data retained in a structured format within a computer system, typical for data considered for regulatory decision making. it provides a framework for establishing data quality requirements, data quality measures, and a plan to perform data quality evaluations. it could be used across the entire life cycle from data collection or generation, management and processing, publishing, aggregation, and consumption and to evaluate the compliance of data with regulations. an example of implementation of this standard is provided by statistics finland, which is also in line with the european interoperability framework, the fair principles and the code of practice for statistics [13].\n\niso 13485: specifies requirements for a qms for an organisation to design, build and obtain authorisation for medical devices that consistently meet customer and regulatory requirements. as with all qms, this standard focusses on the quality of the product, and affects dq as they are relevant for the design, development, production, and use of the device.\n\n# 8.5. notes on alcoa\n\nalcoa+ is a framework for data integrity used across the pharmaceutical industry in areas such as research, manufacturing, testing and supply chain. it postulates a set of principles that data and its documentation should comply with. specifically, data should be attributable, legible, contemporaneous, original, accurate (alcoa). the + refers to the following attributes: consistent, enduring, available, traceable. 
more information on these principles is available in the guideline on computerised systems and electronic data [21] and in [22].\n\nin relation to this framework, alcoa+ provides recommendations that focus on foundational determinants and that affect primarily the reliability, but also the extensiveness, coherence and timeliness dimensions.\n\nwhen considering the alcoa+ principles, they can be closely aligned with the dimensions of the present dqf, with the caveat that the alcoa definitions are more focused and operational. for instance, the alcoa definition of \"accurate\" expresses a set of characteristics (e.g., verifiable coding processes, validated data transfer) that should be in place so that \"data should be an accurate representation of the observation made\", which is how reliability (and more precisely accuracy) is defined in this dqf. other principles such as \"legible\" and \"original\" also fall under the reliability dimension (as they answer the question \"is data reflecting reality?\"), but they are not explicitly articulated in this framework as they are a pre-condition to a regulatory submission.\n\nfor suitable use cases, alcoa+ compliant specifications can enable level 2 (formalised) and above maturity levels for foundational determinants.\n\n# 8.6. notes on implementation of dq controls\n\nthere are different possible implementations of data quality controls (i.e., testing).\n\nif the true facts the data are representing are known and accessible, data can be tested using validation against the source records containing these facts (see framework of reference section above). 
however, validation can be costly and time consuming, and often requires the use of adjudicators if the\n\ndata quality framework for eu medicines regulation ema/326985/2023 page 37/42", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "cd998e2b-0c48-46ba-b8b0-226a27b673cb": {"__data__": {"id_": "cd998e2b-0c48-46ba-b8b0-226a27b673cb", "embedding": null, "metadata": {"page_label": "38", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Enhancing Data Quality in EU Medicines Regulation: A Framework for Accuracy with Intrinsic Plausibility Metrics and External Validation\"", "questions_this_excerpt_can_answer": "1. How does the Data Quality Framework for EU Medicines Regulation propose to handle data that is not available in a machine-readable structured form for the purpose of enhancing data quality?\n\n2. What specific methods does the framework suggest for testing data quality within the same dataset to ensure logical consistency and factual accuracy, particularly in relation to the timing of causal effects and gender-specific observations?\n\n3. 
What role do external reference ranges and plausible trends play in the validation process of data under the EU Medicines Regulation Data Quality Framework, and can you provide examples of how these methods are applied to prevent unrealistic data entries, such as a blood pressure reading of 1000/500 mmHg or implausible disease incidence growth rates?", "prev_section_summary": "This section discusses the integration of data quality standards and principles to enhance regulatory decision-making in the pharmaceutical industry, focusing on the ISO 25012 standard, the ALCOA+ framework, and the implementation of data quality controls within the context of EU medicines regulation.\n\n1. **ISO 25012 Standard**: It defines a general data quality model for data in a structured format within computer systems, crucial for regulatory decision-making. The standard provides a framework for establishing data quality requirements, measures, and evaluations across the data lifecycle, ensuring compliance with regulations. An implementation example by Statistics Finland aligns with the European Interoperability Framework, FAIR principles, and the Code of Practice for Statistics.\n\n2. **ALCOA+ Framework**: A framework for data integrity in the pharmaceutical industry, emphasizing principles that data and documentation should adhere to: Attributable, Legible, Contemporaneous, Original, and Accurate (ALCOA), with the \"+\" indicating Consistent, Enduring, Available, Traceable. ALCOA+ focuses on foundational determinants affecting the reliability, extensiveness, coherence, and timeliness dimensions of data quality. It aligns closely with the dimensions of the Data Quality Framework (DQF), particularly in ensuring data accuracy and reliability for regulatory submissions.\n\n3. **Implementation of Data Quality Controls**: Discusses the challenges and considerations in implementing data quality controls, such as validation against source records. 
Validation is essential for ensuring data accurately represents true facts but can be costly and time-consuming, often requiring adjudicators.\n\nThe section highlights the importance of integrating data quality standards like ISO 25012 and principles like those in the ALCOA+ framework to ensure the reliability and accuracy of data in regulatory submissions within the pharmaceutical sector. It also addresses the practical challenges of implementing data quality controls within the EU medicines regulation context.", "excerpt_keywords": "Data Quality Framework, EU Medicines Regulation, Intrinsic Plausibility Metrics, External Validation, Logical Consistency"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\nfacts are not available in machine-readable structured form. alternatively, data can be tested via intrinsic plausibility metrics, and specifically by assessing the dataset with respect to (see figure 7):\n\nother data in the same dataset: the test would detect logical or factual contradictions (e.g., embedding background knowledge on relations between entities and events). for example, the timing of a causal effect must occur after its causing intervention, or female patients cannot have observations only occurring in males.\n\nexternal reference ranges (or gold standards): some measured quantity cannot exceed a certain magnitude, such as a blood pressure of 1000/500 mmhg.\n\nplausible trends: certain data can be valid when observed individually, but the collective trend of all data of a kind should follow expected distributions or trends. for example, the incidence of a disease is unlikely to grow drastically from 2% to 80% in a population from one year to another, or the exposed cell line in an experiment cannot show less effect than the unexposed comparator. 
in this case data are assessed with respect to background knowledge on typical characteristics of data.\n\n[figure 7 - overview of external reference ranges: decision tree - if the true facts are accessible, implement a system of external validation; if not, implement testing of data against other data, external reference ranges and plausible trends.]\n\ndata quality framework for eu medicines regulation ema/326985/2023 page 38/42", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "abdbd229-f0f6-44a8-8533-050fd3402e54": {"__data__": {"id_": "abdbd229-f0f6-44a8-8533-050fd3402e54", "embedding": null, "metadata": {"page_label": "39", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "Title: \"Fundamentals of Data Management and Quality Assessment: A Comprehensive Guide\"", "questions_this_excerpt_can_answer": "1. How does the document define the concept of data immutability within the framework of EU medicines regulation, and what implications does this have for data management practices in health research and policy making?\n\n2. In the context of the EU medicines regulation data quality framework, how are data quality determinants categorized, and what specific aspects do these categories address in relation to ensuring the integrity and reliability of data used in health research and policy making?\n\n3. 
What are the defined metrics for assessing data quality according to the document, and how do these metrics facilitate the evaluation of data's fitness for purpose in the fields of health research, policy making, and regulation within the EU context?", "prev_section_summary": "The section discusses the Data Quality Framework for EU Medicines Regulation, focusing on enhancing data quality through various methods when data is not available in a machine-readable structured form. Key topics include:\n\n1. **Intrinsic Plausibility Metrics**: The framework proposes testing data within the same dataset for logical consistency and factual accuracy. This involves checking for contradictions, such as ensuring the timing of a causal effect follows the causing intervention and verifying gender-specific observations are accurate (e.g., female patients should not have conditions exclusive to males).\n\n2. **External Reference Ranges (or Gold Standards)**: To prevent unrealistic data entries, the framework suggests using external reference ranges. An example given is that a blood pressure reading should not exceed a certain limit, like 1000/500 mmHg, indicating a measure for preventing implausible data entries.\n\n3. **Plausible Trends**: The framework emphasizes the importance of data trends following expected distributions or historical trends to ensure data validity. It highlights how drastic changes in disease incidence rates or experimental results that contradict expected outcomes (e.g., an exposed cell line showing less effect than an unexposed comparator) should be scrutinized for plausibility.\n\n4. 
**External Validation**: The document suggests implementing a system of external validation to complement the intrinsic tests and ensure data quality.\n\nEntities mentioned include:\n- EU Medicines Regulation\n- EMA (European Medicines Agency) with a document reference number (EMA/326985/2023)\n- Specific examples like blood pressure readings and disease incidence rates to illustrate the application of the framework's methods.\n\nOverall, the section outlines a comprehensive approach to ensuring data quality in EU medicines regulation by combining internal consistency checks with external validation against known standards and trends.", "excerpt_keywords": "data quality, data immutability, EU medicines regulation, data integrity, data quality metrics"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\n# glossary\n\n|definitions|explanation|\n|---|---|\n|data accessibility|the ability of data to be accessible for public use in terms of discoverability, exportability, and usability.|\n|data conciseness|the characteristic of data to be expressed in a compact representation. sometimes also defined as the characteristic of data to include only essential, and not spurious, information.|\n|data immutability|data immutability is the concept that data is never deleted or altered. once some data is \"stated\" (e.g., entered in a database), it can only be augmented (possibly with additional information meant to invalidate or supersede previous data) but never removed. in other words, data that has been entered in a system (and on which some other data or actions may depend) cannot be changed without explicitly recording a new state of the information and maintaining the knowledge of the previous state.|\n|data integrity|data integrity refers to the maintenance and assurance of data reliability and consistency over time, encompassing the whole data life cycle. 
it is a broader concept than data quality, whose precise definition varies across contexts, extending from physical to logical aspects of data processing and storage.|\n|data quality metrics|dq metrics can be defined as indicators that can be applied to a data source to derive assessments of one or more quality dimensions.|\n|data quality|data quality is defined as fitness for purpose for users' needs in relation to health research, policy making, and regulation, and that the data reflect the reality which they aim to represent. data quality is relative to the research question and does not address the question of at what level the quality is measured e.g., variable, data source or institutional level. these aspects are addressed in the data quality determinants and dimensions of data quality.|\n|data quality determinants|what contributes to data quality or its characterisation. in this framework determinants are classified into three categories: foundational determinants: what affects the quality of a dataset, being external to the dataset itself (e.g., systems and processes that generate data). intrinsic determinants: what can be derived in terms of quality for a dataset itself, without information on how the data came to be or its intended usage. 
question specific determinants: what affects the assessment of a dataset's quality and strictly depends on the dataset's intended or actual usage.|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "cc6d2e9e-5ab4-472d-94cc-ddee579bd55e": {"__data__": {"id_": "cc6d2e9e-5ab4-472d-94cc-ddee579bd55e", "embedding": null, "metadata": {"page_label": "40", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Comprehensive Guide to Understanding and Implementing Data Quality Management and Assessment\"", "questions_this_excerpt_can_answer": "1. What are the five dimensions of data quality as outlined in the EU medicines regulation data quality framework, and how can they be further divided into sub-dimensions?\n \n2. How does the data quality framework presented in the document specifically cater to the needs of regulatory decision-making within the EU medicines regulation context?\n\n3. Can you explain the difference between foundational and intrinsic determinants of data quality as defined in the \"Comprehensive Guide to Understanding and Implementing Data Quality Management and Assessment\" and provide examples of what might constitute each within the context of healthcare data management?", "prev_section_summary": "This section provides a glossary of terms related to data management and quality assessment within the context of EU medicines regulation, focusing on the concepts crucial for health research, policy making, and regulation. The key terms defined include:\n\n1. 
**Data Accessibility**: This refers to the ease with which data can be found, exported, and used by the public.\n\n2. **Data Conciseness**: This term describes data that is compact and contains only essential information, avoiding any unnecessary details.\n\n3. **Data Immutability**: A concept where once data is entered into a database, it cannot be deleted or altered but can only be augmented with additional information to invalidate or supersede the previous data. This ensures that the original state of the data is always preserved and traceable.\n\n4. **Data Integrity**: This encompasses the maintenance and assurance of data reliability and consistency throughout its lifecycle, covering both physical and logical aspects of data processing and storage.\n\n5. **Data Quality Metrics (DQ Metrics)**: These are indicators applied to a data source to assess its quality across various dimensions.\n\n6. **Data Quality**: Defined as the suitability of data to meet user needs in health research, policy making, and regulation, ensuring that the data accurately reflects the reality it aims to represent. The concept of data quality is relative and considers the level at which quality is measured (e.g., variable, data source, or institutional level).\n\n7. 
**Data Quality Determinants**: Factors contributing to or characterizing data quality, classified into three categories:\n - **Foundational Determinants**: External factors affecting the quality of a dataset, such as the systems and processes that generate data.\n - **Intrinsic Determinants**: Quality aspects that can be assessed directly from the dataset itself, without additional information on data origin or intended use.\n - **Question Specific Determinants**: Factors influencing the assessment of a dataset's quality based on its intended or actual use.\n\nThis glossary forms a comprehensive framework for understanding and evaluating data quality in the context of EU medicines regulation, highlighting the importance of data management practices in ensuring the integrity and reliability of data used in health research and policy making.", "excerpt_keywords": "data quality, EU medicines regulation, healthcare data management, data quality framework, intrinsic determinant"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\n# definitions\n\n|definitions|explanation|\n|---|---|\n|data quality dimensions|data quality aspects are partitioned into different groups that answer different questions about data. such partitions are called \"dimensions\". this framework distinguishes five dimensions that can be divided further into sub-dimensions: 1) extensiveness, 2) coherence, 3) timeliness, 4) relevance, and 5) reliability.|\n|data quality framework|a data quality framework provides a set of definitions, guidelines, and recommendations to assess and govern data quality. the framework presented here addresses a wide range of data sources for the purpose of characterising, assessing, and assuring data quality for regulatory decision making.|\n|entity|an entity is a collection of similar values that belong to a specific variable (e.g., weight). 
this is also referred to as row level.|\n|healthcare data|medical data gathered from different settings containing various clinical measurements of specific populations. in most cases this is electronically stored data known as electronic health data.|\n|fit for purpose|possessing all required data quality characteristics needed to address a specific goal. the emphasis of data quality is ensuring that the data are fit for purpose, enabling reliable assessments of whether the data are fit for the purpose of decision making, supporting health research and population health.|\n|foundational determinant|a data quality determinant that covers aspects related to the generation of data. it affects the quality of data, but it is not part of the data themselves e.g., software systems, training, audit processes. it can be seen as data generation specific.|\n|intrinsic determinant|a data quality determinant that covers aspects that are inherent to a given dataset e.g., the formatting of the data. this can be seen as a dataset specific determinant.|\n|maturity model|a maturity model is a framework for assessing the processes, technology and structure of an organisation or function. it provides a structured approach to evaluating how well an organization or a function manages its data quality processes, policies, and practices. the model defines key characteristics at each level to guide and measure continuous improvement in data quality over time.|\n|mdm|master data management, a system that helps synthesise, secure, and clean data from different systems (eliminating duplications, etc.) so that the right information is used.|\n|metadata|metadata are defined as \"data about data\", providing context about their purpose and generation. it is a set of data that describes and gives information on other data, providing context about the purpose, location, key variables, generation, format, and ownership of a dataset. 
metadata are often published in data catalogues, which have|", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "c271f2ef-db22-4c9e-8a1d-7b286c5e1512": {"__data__": {"id_": "c271f2ef-db22-4c9e-8a1d-7b286c5e1512", "embedding": null, "metadata": {"page_label": "41", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Maximizing the Value of Electronic Health Data: Exploring Primary, Secondary Uses and Enhancing Data Discoverability\"", "questions_this_excerpt_can_answer": "1. What is the defined purpose of making health data discoverable according to the \"Maximizing the Value of Electronic Health Data\" document, and how does it ensure the protection of personal information?\n \n2. How does the document \"Maximizing the Value of Electronic Health Data\" differentiate between the primary and secondary uses of electronic health data, and what examples does it provide for each category?\n\n3. In the context of EU medicines regulation, as outlined in the \"Maximizing the Value of Electronic Health Data\" document, what are the implications for data quality when utilizing electronic health data for secondary purposes such as scientific research or national statistics?", "prev_section_summary": "The section provides an overview of the data quality framework specifically designed for EU medicines regulation, outlining key concepts and definitions essential for understanding and implementing data quality management and assessment. 
The framework identifies five main dimensions of data quality: extensiveness, coherence, timeliness, relevance, and reliability, which can be further divided into sub-dimensions. It emphasizes the importance of assessing and assuring data quality for regulatory decision-making, highlighting the use of various data sources in this context.\n\nKey entities discussed include:\n- Data quality dimensions: Different aspects of data quality divided into specific groups or dimensions.\n- Data quality framework: A set of guidelines and recommendations for assessing and governing data quality.\n- Entity: Refers to a collection of similar values belonging to a specific variable.\n- Healthcare data: Medical data collected from various settings, primarily stored electronically.\n- Fit for purpose: A characteristic of data quality indicating that the data meet all requirements for a specific goal.\n- Foundational determinant: Aspects related to data generation that affect data quality but are external to the data themselves.\n- Intrinsic determinant: Aspects inherent to a dataset that determine its quality.\n- Maturity model: A framework for assessing an organization's data quality management processes and practices.\n- MDM (Master Data Management): A system for synthesizing, securing, and cleaning data from different sources.\n- Metadata: Data that provide context about other data, including purpose, generation, and ownership.\n\nThe section underscores the significance of a structured approach to data quality, focusing on the needs of regulatory decision-making within the EU medicines regulation context. 
It differentiates between foundational and intrinsic determinants of data quality, offering insights into how each impacts healthcare data management.", "excerpt_keywords": "data quality, electronic health data, EU medicines regulation, primary use, secondary use"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\n# definitions\n\nexplanation\nthe purpose of allowing data to be discoverable and checked for fitness for purpose, without revealing the data themselves.\n\n# primary use of data\n\nprimary use of data\nprimary use of (electronic) health data is the processing of personal health data for the provision of health services to assess, maintain or restore the state of health of the person to whom it belongs, including the prescription, dispensation and provision of medicinal products and medical devices, as well as for relevant social security, administrative or reimbursement services.\n\n# secondary use of data\n\nsecondary use of data\nsecondary use of (electronic) health data is the processing of health data for purposes other than primary use, such as national statistics, education/teaching, scientific research etc. 
the data used may include personal health data initially collected in the context of primary use, but also electronic health data collected for the purpose of secondary use.", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}, "7e79e502-e90a-49ef-a6a1-0c1e7d935378": {"__data__": {"id_": "7e79e502-e90a-49ef-a6a1-0c1e7d935378", "embedding": null, "metadata": {"page_label": "42", "file_name": "[37] Data quality framework EU medicines regulation.pdf", "file_path": "/content/drive/MyDrive/Desarrollo Pharma.IA/PharmaWise Engineer/PharmaWise CSV & Data Integrity/raw_data/[37] Data quality framework EU medicines regulation.pdf", "file_type": "application/pdf", "file_size": 1164991, "creation_date": "2024-05-23", "last_modified_date": "2024-05-23", "document_title": "\"Ensuring Excellence in Healthcare and Pharmaceuticals: A Comprehensive Guide to Data Quality Frameworks, Standards, and Regulatory Guidelines\"", "questions_this_excerpt_can_answer": "1. What specific strategies and frameworks have been developed within the European Union to standardize and ensure the quality of data in the medicines regulatory network as of December 16th, 2021?\n \n2. How does the European Health Data Space Data Quality Framework, as part of the TEHDAS EU 3rd Health Program, contribute to the improvement of data quality in healthcare, and what are its deliverables as of May 18th, 2022?\n\n3. 
What are the key components and recommendations of the \"Ensuring Excellence in Healthcare and Pharmaceuticals: A Comprehensive Guide to Data Quality Frameworks, Standards, and Regulatory Guidelines\" document for integrating ISO 9001:2015 Quality Management System standards into the healthcare and pharmaceutical sectors as of November 18th, 2022?", "prev_section_summary": "The section from the document \"Maximizing the Value of Electronic Health Data\" discusses the importance of making health data discoverable while ensuring the protection of personal information. It outlines the concept of data discoverability as allowing the assessment of data's fitness for a specific purpose without exposing the data itself. The document differentiates between the primary and secondary uses of electronic health data. Primary use is defined as the processing of personal health data for delivering health services, including treatments, prescriptions, and related administrative services. Secondary use, on the other hand, refers to the processing of health data for purposes other than direct healthcare provision, such as for national statistics, education, teaching, and scientific research. This distinction highlights the broader applications of electronic health data beyond immediate patient care, emphasizing the potential for contributing to scientific knowledge and policy-making. The excerpt also implies that in the context of EU medicines regulation, ensuring data quality is crucial when utilizing electronic health data for secondary purposes, to maintain the integrity and reliability of research and statistical outcomes.", "excerpt_keywords": "Data quality, EU medicines regulation, healthcare, ISO 9001:2015, regulatory guidelines"}, "excluded_embed_metadata_keys": [], "excluded_llm_metadata_keys": [], "relationships": {}, "text": "[37] Data quality framework EU medicines regulation.pdf\n# references\n\n|1.|european medicines regulatory network data standardisation strategy. 
december 16th, 2021. available from: link|\n|---|---|\n|2.|european health data space data quality framework, deliverable 6.1 of tehdas eu 3rd health program (ga: 101035467). may 18th, 2022, accessed at link|\n|3.|iso 9001:2015 quality management system, accessed november 18th, 2022. link|\n|4.|kahn, m.g., et al., a harmonized data quality assessment terminology and framework for the secondary use of electronic health record data. egems (wash dc), 2016. 4(1): p. 1244.|\n|5.|healthcare data quality: a 4-level actionable framework 2020 [september 5th, 2022]. available from: link|\n|6.|schmidt, c.o., et al., facilitating harmonized data quality assessments. a data quality framework for observational health research data collections with software implementations in r. bmc med res methodol, 2021. 21(1): p. 63.|\n|7.|sentinel qa program. quality assurance - sentinel version control system. [february 14th, 2022]. available from: link|\n|8.|nestcc. data quality framework, a report of the data quality subcommittee of the nest coordinating center - an initiative of mdic. 2020 [february 14th, 2022]. available from: link|\n|9.|the national patient-centered clinical research network. pcornet - data quality framework. [february 18th, 2022]. available from: link|\n|10.|duke-margolis center for health policy. characterising rwd quality and relevance for regulatory purposes, 2018. available from: link|\n|11.|duke-margolis center for health policy. determining real-world data's fitness for use and the role of reliability, 2019. available from: link|\n|12.|data utility framework. available from: link|\n|13.|statistics finland, data quality framework, national data quality criteria and indicators. available from: link|\n|14.|cave, a., x. kurz, and p. arlett, real-world data for regulatory decision making: challenges and possible solutions for europe. clin pharmacol ther, 2019. 106(1): p. 36-39.|\n|15.|big data steering group, big data workplan 2022-2025. 
available from: link|\n|16.|fair. available from: link|\n|17.|good practice guide for the use of the metadata catalogue of real-world data sources, ema/787647/2022. available from: link|\n|18.|wang, s.v. and s. schneeweiss, a framework for visualizing study designs and data observability in electronic health record data. clin epidemiol, 2022. 14: p. 601-608.|\n|19.|european commission. can we use data for another purpose? [internet, cited 27 sept 2022] link|\n|20.|american society for quality. what is a quality management system (qms)? | asq. available from: link|\n|21.|guideline on computerised systems and electronic data in clinical trials. available from: link|\n|22.|data integrity and compliance with cgmp, guidance for industry. available from: link|\n\ndata quality framework for eu medicines regulation ema/326985/2023 page 42/42", "start_char_idx": null, "end_char_idx": null, "text_template": "{metadata_str}\n\n{content}", "metadata_template": "{key}: {value}", "metadata_seperator": "\n", "class_name": "TextNode"}, "__type__": "1"}}, "docstore/metadata": {"b01e3aaa-a296-4d1c-bf00-9f8c0d1d55dc": {"doc_hash": "401638c9098929b875c9d3a30543d344012dc15870f9ae1510048d655ab516fd"}, "21d7a945-535b-49dc-8671-d9a9990671c0": {"doc_hash": "f145129abbac8cd94fd3def21cc8fcdaf71b41168f7049819028965d95245e9e"}, "e43a0b04-d6ac-4139-baef-99825e68067b": {"doc_hash": "c984e3dc466f91d01a6ca7b54f3518d409e23ad98cee051e11e3e9d7564d23f9"}, "16bf1a23-e3d2-4957-9f1d-f8f31d3c6e49": {"doc_hash": "c11656185f5b67151bbdd6392be220da81a8a48bf060f9f96130f85c0b4b0dc2"}, "65ef36e4-4f0e-4cad-8942-0539434406bf": {"doc_hash": "6ff2f08bca607db37f093418f9f9b84dcaaed1707b96108fbbf8db4eb93fe48e"}, "71d3a657-f6df-4812-ba3f-254422b331f6": {"doc_hash": "55d2fa4aabb143f2dbed5217cb6bbc70277b3f21f13a85492eed2714a5c39ce6"}, "23cf1f26-a44b-45fb-84ea-3744a5fd1976": {"doc_hash": "4703a93e69580a0b9b44050641a138097ec62d516a265f758b9ed3a59c591389"}, "48480ece-854a-448f-b69d-c0ad1c7b5e5e": {"doc_hash": 
"bbbe42cb7356acebc6315f928642bc61ee03284ae1eb425fd75d813d61a68474"}, "bd053438-7e91-4bca-a4f4-c55e9336f3b6": {"doc_hash": "550bdd504cb74239f34cb8d77e04c477977410faa1367f239287993d55a0bfc0"}, "da968890-4b13-4459-955a-2681f559ebb3": {"doc_hash": "e405f70165d3b8fb9781c35fbb7560a27feb3738192d4c8487f8f813911d4b91"}, "5ef4be60-6d44-4f84-be26-b76fea0880a3": {"doc_hash": "05a2266d02994a69f4719370a8aa4a7eb2de611cf110943fd3dba64039e4df6d"}, "f1b2a0e4-a27a-4627-8cbb-3aa7ca126dcb": {"doc_hash": "b51208c1cf88cbb985a4d184e837fb7a27a0d445a1e3dace7d70a46b8cc7812a"}, "a4377e1d-fe2f-4be9-ae2f-e4726d3bbf2d": {"doc_hash": "32e9b1a592633c46abc2ad15e787b99857d822fe9cce0a6e22bd37dda037c971"}, "cc4abb48-14bd-4b06-8a5b-681e65453287": {"doc_hash": "3cf0c266d75c59648a7169c42e56f996d596c8c99b4c57f11bdd419d779919bc"}, "d6da7767-6165-415f-a3ea-82df581f8b3a": {"doc_hash": "f6052dbde5a529ab548d15944d89713873051799a0d33436511015e954cb24b9"}, "7b8554d4-6f72-4e3d-a971-42029930f6ac": {"doc_hash": "a0aeb72482876ed6a178117f96abd4962f988f0180442620c96dddcc42b043d2"}, "7d85cbf6-d778-4905-9080-8183ecc08727": {"doc_hash": "534671b21c132ed005ada78b3a292054c27e4c798cd48b27a7bf2f050b09189b"}, "0fe3dabe-68dd-4e4a-92d9-555a6ce7b73d": {"doc_hash": "284673fd12fb9b16dc4a42c4d83a652070e141d5251dd8fbdee17f3bd014f6e5"}, "78d0b9da-712f-4fe9-95b4-b9217cc64349": {"doc_hash": "631ada55c2b779fa18cee3c2c842e67d36b7550faaba7a2adeba849055be9c4f"}, "3f6410a2-84e5-4355-8686-16804d3a0830": {"doc_hash": "9359189e0bbef089624ac9c5bbd404820dd07df77082ef8b0094bcaed0dbaa0c"}, "65d1f3ac-9e2e-47b3-a972-ddeba134a7f8": {"doc_hash": "6060c341042619b090b54d3890bde104cac29aaecb292b7a1bd9c7535e8bef0f"}, "9f5f47bf-6582-41e6-9d64-ee03f81dfc51": {"doc_hash": "adc146f0865c0d5ced0a182418a02dd2bad24d8ab86e0846803e239a47a09357"}, "920df918-9815-4156-af26-be6bb4a58af1": {"doc_hash": "b2265844d554905d05a438b9f8097515756c8d65945c3e8e31d7131761d983dd"}, "f1b1dfe8-0efc-4cd2-ae19-568ff7edcd82": {"doc_hash": 
"d31da3640b3933d6f32dba031b20e7453bf1f9ffba0c8387859dbd1d680d48de"}, "8ffd9df1-3229-49ea-bb92-de671b651bfd": {"doc_hash": "1eaeda2d51a7abe3def1af3b5e165b04036ec83da4ebcafb3f736497dd7ca845"}, "1fbc8590-0f0a-460f-aa97-8da0b031e3d9": {"doc_hash": "ed1522711d39211b6380fa92290bc356ee17462ee7b5bac9fb1042486ec977cd"}, "a94cdf23-364a-4d67-9d28-2d67c94131a3": {"doc_hash": "9982f8d69dc5adb89d89a4ec80d74684e78b145137b47f345fbe75aa144e4df7"}, "44ad0e88-06b0-4b5f-a6eb-cabc45a62b28": {"doc_hash": "fbedd8f17a3161eaa681ad7b5a9f1b9d1d05f018d85250fa84506f0665b76ee7"}, "14a02b9a-d956-4a38-8322-167e5423e9fb": {"doc_hash": "6764ca1d8091d1b6a5c0ffabeb2c5775890517b7df980c4a326746251402e5bb"}, "195af384-c7ae-42e3-b353-95430498e04d": {"doc_hash": "47665d2dc863fa392aef1049d027ef9531cdb349a1d05cf933c4ace8528cca29"}, "44c8aeb1-1039-42ed-9483-1573b3927fde": {"doc_hash": "c7a88d9dc4ebc69404503e4de07f08dd9eb8ec2f55493d8d3a3fdae2e6adc986"}, "335ec4e4-f8dc-4d09-b9ad-58d0bbe2027b": {"doc_hash": "117e1342389eb2ed835e41736001a5712026e053f2cd5ab2d04d7d372cadaa9f"}, "e4de2752-21f0-4fe6-9995-6db55a5ebff7": {"doc_hash": "bbdf72ac9febf8419335c2f4678910fc932132c42c0084ebcb143f8623393310"}, "c6f7dcae-738c-42f2-8a33-5305ad0e1d4a": {"doc_hash": "d70d7463099c8a396736102e6933276f514dafeeb70ea35522abc597a53b6ccc"}, "a3bf2925-3d16-4976-be12-2c0620a261d6": {"doc_hash": "54d24840b1707f074acf2a00d26965f7aea7ed682ea44c06053645e863da27b5"}, "60fec87f-459c-4791-b671-57d83796da9e": {"doc_hash": "66d7ba1287df3bd5b8b4361e853e440e8af5b5de755cbb77d82cda21f7aa118c"}, "10435c38-1950-4f50-bfdf-0d06c25b3d68": {"doc_hash": "0a2e67f6d600d5c3eacb96b50734e56a83dd8aced7a02156d049c23f2e5e0a4d"}, "bf717834-4f43-4073-9eed-670cfcdde9fe": {"doc_hash": "2b58056f55b1476f15ace18d32c88dee236476de490afa5eed4ad0c7cf35eee9"}, "11298353-5d97-483c-9d94-a2b9d0d72763": {"doc_hash": "e4f9b52104a1188c311c1587f09263142b83fa44cb43cba033443315fc61e9d0"}, "909ca3a0-dfb6-4b11-ae7b-7c4aac85abbf": {"doc_hash": 
"3781af685767dca32f3cb17f310969537e4f0a379fa4e760c5a1724e5f4552a5"}, "34874db2-8d82-48a4-a1f4-ca790b227f8f": {"doc_hash": "7899feea4b72b0f8e815e5ee23472e737d20fcf73f46e6337ff4981c102bd730"}, "e4fcd593-2fa9-4620-b99d-6e5d20f21479": {"doc_hash": "fd53f3c9589f30f88e74e937fd860eb88ff1fcd937dafa9c3649b2c42b1b8898"}, "775601fe-6fe8-4f0a-a662-6f8f3cd2d4b4": {"doc_hash": "9ba803ef4abb1d77353ed75b21b7c703012bb3f457d67ed40394590a1f3091ab"}, "ae2290dc-dd59-4404-8166-d139f17b1515": {"doc_hash": "cbed1cb09d6729fd2a458e6f06ae076ec7d71080d6d1d8007a7e7d400b9b6f6d"}, "ea2a6322-9fb9-4522-b8e9-4bae3b25f1e5": {"doc_hash": "48bf6fdcb400ac959f516f3814a386300cb0ddd0d5066439c89827dfda4d0d12"}, "81d5f905-e73f-4c1b-b992-8b522f1543a4": {"doc_hash": "61d5a83f00f2b66d3318ce7bd83a890551c59270384f9ac748d4202296907bef"}, "c09f7dd1-d8b2-48d6-b06b-a9dae0ca0327": {"doc_hash": "6b1ec02b0ab83dfdfdf789019a31b59ac83b5a1a570924fbd5cee3c07a232c78"}, "c78ecd61-fff0-4d59-a17e-5386fd875cf1": {"doc_hash": "3812c645591168b6d751066140ae6856d3ce208bc47b8ce183fe8f0ae09b550e"}, "0431a216-8edb-4a40-b139-248dc972b3eb": {"doc_hash": "7a1a7ac5f8351c6eef3722f7b839f83aa8b69b2298677a119a6e83e4f3a340cb"}, "d51f7448-e987-496e-84c6-7e191f4183a1": {"doc_hash": "96fcec2d9712197f06a10bf1c5838ad7ff7dffd8397c2758ee7e17abf398ee4e"}, "041b4372-f64e-4523-9a0f-6bbf6765521f": {"doc_hash": "cd26ce0644534341626cea75e8abb1476460e0a144cc85d2783faf95e533a571"}, "f525b0ee-7da5-4291-9844-d470254b78c7": {"doc_hash": "acb5bec54611d034e44b139b08e39e93e81c5f3a428798722ca53638ae06be2e"}, "4cfedc43-b292-4e46-b2ad-caf13162bafe": {"doc_hash": "eedc33a813744bb9a33db95006f6299296367c2ce5b54017ceea6ba0b3bbf8c5"}, "e3fd53ba-55d0-4b4d-930f-fbfc3d2f2438": {"doc_hash": "34a9c1128c1acff2f08832de2bf6e56fce84a502a689e6f7a68d237ce2084215"}, "44b218cb-0fec-4326-9faa-bbeb552daf77": {"doc_hash": "6089f99cadbf760d9ed198bbc1a1598176423d6d000280ca7321a348875eb18b"}, "a22f9db6-0acf-435a-b74e-93c1b630cc29": {"doc_hash": 
"a0f70fef05b0c28cf7ee06a2262c10805fb7abb0d580d7b6d4668567dc36dd28"}, "f80e8a59-495a-4572-8f9f-771eedea8d18": {"doc_hash": "f6c59e3d7aad7319991029e6944a277d56981b498a1de8e9707a8efb1dd4869e"}, "c160d8b2-03b3-4231-a046-f55f948ffd7d": {"doc_hash": "6acdd5f24e31976b746a031a573766c2141a8afc948a89c72019448e4592f8c7"}, "561dc3c9-9014-42ad-820f-7edfbfc98e22": {"doc_hash": "639cf162aa7af2958f306e3f92b7babf4e6039a66f80321acc9aa1b94185d247"}, "53537c3f-4eef-4b6f-a8e8-9275a47d103e": {"doc_hash": "cc76bdc7076ecd8eb91edeb984bb3093bc67c8b3e2fe43b0a850d25c2785ceb2"}, "05cfa466-54ca-46a4-bbb8-305120d4f123": {"doc_hash": "daa4b3e070dac98bfc8f3fb160ceffa5bb51d7383fecde64f621e1abd9f3aa17"}, "a57744b0-46ad-46e2-ba7d-816a892c5078": {"doc_hash": "6793d3eb1f59a921becfaddd8d520aef6317a7aba51327ea74fd87db6f822984"}, "9f3bf483-edac-4661-9f21-8e17d77d7d23": {"doc_hash": "7e56e07543305f17d1221260132521be042f0ed4e6bd49a2c57cc44a12820d79"}, "d1b69d79-fbff-4402-83f0-822f80b23b48": {"doc_hash": "63a487fb3f2b7ca6bab6669f96367d8ef02c808cf9fedc78d2a421b65a40c70b"}, "10c1c554-e5b6-44e9-8d0f-feabfd7f9f1a": {"doc_hash": "d1e9eda017c559eca8920d777827287fbe9e30d920f6207c9f65781b19df272b"}, "8ce39848-39a8-4b03-bc28-70af829e635f": {"doc_hash": "99c0af9298d2fa44ef5f429b5fd21828282974f11da08c7689c7e2f4c4d86cce"}, "a1e9b433-0f86-4472-af86-975ca87dad12": {"doc_hash": "bf36c05944b6ee72e85523d52bfe08f3c9ce4b6270f8189023e2e00861f3f4d7"}, "7808f342-5d85-47f8-8af5-f939f8ba91da": {"doc_hash": "b563034103ab4c6b3d6d83c4f87b363334a42d40edd794c3c08f8871a79e0574"}, "c892b423-69ae-471d-aa2d-bdf0e232e5b4": {"doc_hash": "e9ddad41621abe0394fea5339fae7c4953a8c55a19ef9a594373122dd333d6d3"}, "8ebc4103-913a-4ff6-99f7-a605b7d1d042": {"doc_hash": "d07d263617cfef4aa917c75234890360d9741b46af4fdd60bb661574c55ef2c7"}, "27b21ecc-905e-4d2a-ba48-91887737886b": {"doc_hash": "a1b37830d8aa879b9174559c09e3d628a010d407f533da71a766fb4db6b092e1"}, "beae4628-e705-4ce6-a336-cda7042fd221": {"doc_hash": 
"a8b30a9e7081adfa72a33d93b2e261a762fe88a9b8825abf30d76e385c2d4e4e"}, "93619378-84b1-48ca-b38a-d682d4860faf": {"doc_hash": "8c66426208b405b60650655801273c9c1f2fad6ff7db2a7db26a73a3e9e3bb1f"}, "bbe413ba-4f7d-40cc-892d-267beb000531": {"doc_hash": "69ca69e1993678252b18e0f2ca8d2587b3f78220bde5b3f264dae596dc92bb08"}, "37a5e373-9797-4790-8c69-9edfea623658": {"doc_hash": "bc28118e4a9bf8242e78beb800ff7f586b3ea1ab6a8abd3cd4639835d32934e8"}, "02c4bc73-9842-49a3-861b-da9446747a81": {"doc_hash": "41f57e30702c43d1f729c21f59745b6bbba5871a7e201e5e6aa9f78f8430ee60"}, "9a9d78ea-90ec-49e1-b40d-ead014424cbf": {"doc_hash": "224046d6e8d8046c9fc570397d11ebacd18e3d0131a65cda04bef70aa4fe2b04"}, "22b4ec3b-e384-4251-8691-8b2093a39ad6": {"doc_hash": "1bde77884f452f51cc0efc365b439cf6bca86f17c9806e8133124336d93d91ae"}, "d1e3e987-6db7-45d5-9f73-4a6acd9e7a09": {"doc_hash": "497f25a0330668f35bbd661371ec00e5fee7337a738a252e812d0e6a8dc39118"}, "1eb6334f-6f33-4513-834d-7d83a20655ea": {"doc_hash": "61a4932bd807ee4ea4d37d25fb79e55423251198d8925aed53e205edc325ef7d"}, "3437af41-eb15-4184-b435-44b5393665b9": {"doc_hash": "8a63b2f1c11f5084aa4aaf6e5d01489852e3dd386ac1f0d55547a8cd5e7f6af6"}, "4ee1ebd4-dc9a-413a-9a8e-232958588151": {"doc_hash": "d8678dd536dce530147b542bf80e81c438eec8c056d6440e719c37cdb3d75abf"}, "dfb3836c-c403-4286-b25f-d9d3ddd6283d": {"doc_hash": "ba98242203bd471abaaf66268d0f64b393cf88fc1e1244daf5ef1abc901b3fd3"}, "da99a6a3-c1d6-4349-ba2a-30679a9458ef": {"doc_hash": "f0766bbc34b68d410fcc1a1cf458a6e44c2dfbc64eac1661c5d4126866ee05cd"}, "bc88fad5-d3e8-49e8-86de-5866b7df96c0": {"doc_hash": "3b4ebe832bf9882c4697c7ab9401921817448a6db65f69d092ddbe34dec3840b"}, "b32a5ca8-89f3-4f6a-a6bb-898036702a76": {"doc_hash": "62e03b0bd934993b479000604aa3739a59b789596e5a10abde31adccb0464a81"}, "5940533d-c963-454d-8d56-7c03b5edb2f8": {"doc_hash": "95f6bc615a484f5d0e54a2a2bf4968f381e2c897debb69545a56bdb1ff0eb712"}, "cfa567c0-1d26-432e-9dae-19bb11ce53e4": {"doc_hash": 
"4fdb9021367005b91f55e11b630ee27113761dd374cd8ae0bdb37ac0ab337572"}, "66114eed-9081-4807-b1dc-248f2406c876": {"doc_hash": "9ffd9d5c7c12d9f1b174ade1f3d859dc2675ce8975c14e3823cb6ea31afbaa7f"}, "ca5977d2-547f-4300-9767-f9abcf87b5c0": {"doc_hash": "1c5aed5672fd469c9cf5e4abd68994a54b40515aea55108757caf0872e1cf3ed"}, "8b322373-677d-4e65-b3de-5dccfd4e19dd": {"doc_hash": "cac970413ceefb59494cd4624ea0baf38aaec247ed1998839250eca2a795280a"}, "dc77f025-ce42-4dd3-90ed-697d15a7e211": {"doc_hash": "bfe66fff7ebef08d0bf4bd47dc06cc9058230c85b30fa7219ef99f3307c03d04"}, "35a2dc57-6271-455d-872d-2aa8263a6bf0": {"doc_hash": "30e323fcc13586c51e2bf56f3cc8be9a532b532803b28ec3388bb5f1acfa061a"}, "d7a165fa-89cd-4fe1-9e7d-fe0bc87a7c95": {"doc_hash": "bacdd78b666cc6487ebb7f18939cf661954e25e86dde3b550e572c3d513b479f"}, "f91c11b5-0c3f-4610-89b0-6aadb5b1ad03": {"doc_hash": "1e07f837e3dfc1bb83e719f993e2231374121074c11a1f717bf3e098c69c72ab"}, "3815faec-ea86-4aa0-95f3-681be3b929af": {"doc_hash": "7a4f7f63ed04445aca8d80cb29f360ff4262f2cc5e1b4c3bf30ae03cd541e0c8"}, "e38dc33b-2925-45fc-a9f7-14c35ea939f7": {"doc_hash": "1e2157cae0e50c19eb2f624b820f8edfb989eb28df07e87f369dfce76cfd51ac"}, "cb6d1026-38e2-4052-9257-e4e65ce191d9": {"doc_hash": "beb3b13151e7aabd91935c0fbc949fc22fbf044370bf280ce7217235a57af8ae"}, "e82b6d27-f513-4cb0-bfcd-ac1653a313fe": {"doc_hash": "87fcf88d3bd64b17d41914063b14cbe4558bffad21d6a93630d0089efacde3d3"}, "ad6d2f7b-d327-4a95-b069-8f175bc20f6f": {"doc_hash": "9938cb0cfbc9ff15cdf9e993df83f50accef48be3b6ca0c00c5d4208eb978d59"}, "03f174e8-d52f-4363-9e75-e4e90aa599fd": {"doc_hash": "46c85c3f24a7a46c1233ec485800b78467806a531e9a4e302198c1a63b158f09"}, "fe66d819-f10e-4e0f-a84c-876bcb2be3e2": {"doc_hash": "5d2f987116225335b7542bcf4c0a1331c9469e73f8a5abddf1f3331cc048e60c"}, "14ec4584-b2bd-49af-af46-5afecab6b2fb": {"doc_hash": "fa64abeb18518d2b82532b00351982b81d7214fbc7cc1c33e0244e1084b58cd7"}, "1a7e330d-7fa3-4a04-baa0-736c1dbe2f14": {"doc_hash": 
"1cedc873bef4b0f16ea9a3e15fad30f055e8b384c63e392c2de8ea9cd8629c2d"}, "78960670-c5a1-415b-a044-c2235af43c69": {"doc_hash": "5dc96145182661070e2f65e9eb10946d5e61cfde8648c24ec8e4ff6b4611a171"}, "d322710e-a555-461c-9444-6f0d24cae694": {"doc_hash": "7aff250ba41ea9f58c7fd0a4446e531cd08a9b5d9246e26063612999ce0454de"}, "a6a154e5-1ab2-4f74-94f5-d2c0749f65a6": {"doc_hash": "1edac0dcd9440c0196f2a9f21fb1ba82d002ca455a1877e0212caf215b8adfb9"}, "d7c66be3-577a-4677-82ba-ee3dd8958744": {"doc_hash": "d186b8c2c73df524469e269e1f72fe7f1b8c4287cfc006175287d65e4e7777ef"}, "0aa15273-cc8b-42e0-bb59-170ee61d9ca9": {"doc_hash": "c8ee6000c285a890ba8d63195bc26271274c65df059ecb806e5838f0f59e61e0"}, "e7db645d-5c19-4cb0-8df7-d6869ef35dee": {"doc_hash": "20410f52f69a4bdbd55c5f395c096b5c4e766555abf8a065af73d77a20d64c54"}, "8a7c5ca6-d226-4a7d-8fdb-d806f4c5a395": {"doc_hash": "9bc68e88913360593d605d2809a9e856b2f2b737157d005c648122af5c0de864"}, "b7099f3c-5f0b-4492-a383-b481c98fef25": {"doc_hash": "fe26efa687b0bbe20131f3d412439b1cb4ec4f103e65c5e5d01c3a8b5e54663b"}, "3e806ead-557a-4a2d-8b41-69e31ee84077": {"doc_hash": "0afacae8ba6c473861c98581bffc9eac4c35006363c98594d4159c709bf9bc34"}, "8c51ab00-f84b-49b5-bd24-bf82a248d4df": {"doc_hash": "da6f7c49b2d4adb109e4ebd609e78eb6217b0c30e4e56ef1e38a7e2c83e09c8b"}, "bc8ec215-4c2d-4d2f-9e62-177ec1ed63cc": {"doc_hash": "63a112b02d2d593ef2eb0484f75d6a46503e88aae302d19a5ef72b7609e7c74d"}, "67a70364-8ea3-48e1-84fc-f1f5fe529b8f": {"doc_hash": "1f72360edac40c4d98b774a5182a09c73acad021235240568caddd5b22234c7c"}, "ce1337e4-bc80-4895-a584-8071acb6cdd9": {"doc_hash": "57f0cbf86017569cedbffb7f5dae05df84592f4b4a5db996b0be37ef32e25b35"}, "35a90664-7150-448e-8233-0b8ad3a23ae3": {"doc_hash": "6f832a454a5924cb877cab798d33557181aba10b9b4266d4bac27bf757d60776"}, "3e379a98-9ad8-4add-9075-eed3bd912230": {"doc_hash": "358255969bdb716d5bcae83315c265f836298826df03a5e1b07adcb653c6404f"}, "03f0f302-eb45-45d5-8b1e-29db9e65287d": {"doc_hash": 
"05967b8363623acb15e2358d5e25a8905b002a0fc22d0a277f4dc8090bcb1f8b"}, "0bc9ebc0-1c0b-437a-b4b0-b436ff1868df": {"doc_hash": "6ca1d3cab41aa8e49439a44001412fc83da07fd3b88b7cb38fd476070c7c794c"}, "18f6e04d-1430-4ed0-a299-f71d18af1d30": {"doc_hash": "b31692ad9e1ab1941da61c4d2eb5fea79a1da8855490fac85a0bb908f1393220"}, "d6a029c1-05fa-4dc5-95e3-57f6b947dad8": {"doc_hash": "83fc9714f35dcd019050f5418a405ead7e93bd6d5156d266d915a3f9c2ac24a0"}, "905b1b3e-0b50-4c24-8d13-cbcc85d03d6d": {"doc_hash": "2f314e281e5c1c5879f550987bf99aa5cfe7c84951a287f705e1b536bbbb205b"}, "301e6f30-7d78-4102-b750-cabe09bd3c62": {"doc_hash": "019c9d510d9250784899f9c07c2fd74a6f8b3b2a9b3221d9d0ec794fa41e5ef2"}, "517f5f8b-5593-42ea-bc0a-d569113cebf9": {"doc_hash": "7622b30e78fe8c208969cc142af774426c27449a0152a99e0330a8e98bcc0768"}, "19e5d56b-dd21-4abb-8bf2-f67e8778b87b": {"doc_hash": "00ad0abebe5b14322c9183b7dc1133aaffabc1a44bcba2417dd6eb1fe040c7b8"}, "59d8196f-63ef-4eb5-950d-9fc453590c4c": {"doc_hash": "9a0a18f97e3b6fd9984e9dee72a4fdefb40ab981e40e55309488ef6b1e31c3da"}, "40caba99-fe3c-4fc7-9ac6-673cf1db30d6": {"doc_hash": "da0ba227cca613664784547d73e6a38a5b444dfa9ec1f1c6100e6cf7f20c90ce"}, "f720f921-b093-480d-a2c4-584c3c83be61": {"doc_hash": "cec4b2f6befd056fd60a28a37110894a3628fb84cf9658c10dc9c978fbf75188"}, "f3a6acf7-e14c-4729-bb46-c6e69c152f01": {"doc_hash": "bb002798537d3be97be678adf1c8f0329ceffb2be22024ffe7284a8c5c28658b"}, "bd12b854-bba2-4ad5-bf1a-b3253b168096": {"doc_hash": "7b615898b74de0902fd6c5cc96b587b146bfba643cc84623c62ad5ee9e2fafb7"}, "e037f0bc-96f7-472a-97db-c9b41f14e7b6": {"doc_hash": "13244023afe28201152f9ac2dc74316022a928c2350b73c1f1190ad3255a1a6d"}, "fd9d8f11-c7af-4d16-a930-e1bb66ac67bc": {"doc_hash": "e3e13b1aa90444c34044fdf014cbd41f1e80f3518d529882c436b446262eed37"}, "87c458f9-096e-4a4f-83ee-3b95190fd058": {"doc_hash": "dc7c081d0039b9e1aa3bc683e5fa59127ad6c52b66d554bfea8f41bc3ebe60eb"}, "36628614-89b0-42d6-86bc-0485980e0306": {"doc_hash": 
"720041f4abeacd7db3cade0b60bf6532ad7ea69a515798addb64bd4edfb3cb3a"}, "e0099462-f639-433d-afc1-b067fd19543a": {"doc_hash": "2980326cc8b0e536712510611df5d9114a485c119f5bb46f42ddcf0bd81cea7b"}, "1e95186d-00fa-4d71-aa1b-9d58ee08bf1a": {"doc_hash": "ac143a18e6184d82d1f9ca21dc85bdbb0dff1ef339d96ddb8338f1adb613d412"}, "97470139-e2ed-4d04-8383-a1d76a2745f2": {"doc_hash": "5f36d1d1cc49d94d40f124825b1f14a19ad30d2a8fc72d5486c2775667ee1ae5"}, "5b06597f-8284-46fb-9a88-17ef88db1f57": {"doc_hash": "d008f7ee3a4aa0430c4c6327d56ac94f5f1deff1f8027d8916a2118d325ec30a"}, "f34861e1-e363-410a-8204-7732c4366937": {"doc_hash": "5d7c6513217af8feb31eca9609116acf5da5d9730a8936a3362328fb0a88d4a1"}, "0d6788ef-7977-44b6-bb24-74ba5e887a69": {"doc_hash": "aa7e4fc82debb2c57a20acc3fa4a15d5f6db480eed64a55cafefa6f0fca473d5"}, "5a2f3111-3656-4477-b90e-6544462461c1": {"doc_hash": "6833f9dd71468e1d5507ae4302f30504349de69963104035bc36d60b98945da2"}, "5a5f05bc-67c2-4d0e-ade5-82af60facff7": {"doc_hash": "a4a15cbaedc951f1fb6dd3ec27df5e381b5a61caddd7ee58507341e03b67a350"}, "0b590134-d76b-4862-90b1-5833446eb91c": {"doc_hash": "08141365946e16ba87e87c12060e2af85e3ae664683776ee8b2a87cf31fe7701"}, "139a33c0-43c2-4105-951f-39ba76e1a4f2": {"doc_hash": "db25ae94a8cfef742b0c7203a9c8c093d6e8a44a58a94e9f8f09d3a23b85fe3a"}, "88165dfa-c7b7-45f4-80f6-f27dcf603ed1": {"doc_hash": "38b291b3e6bc71d8dc2b4cd254022e0abbaa8d64dad38b5f46d4b2ab0045d82c"}, "2416b543-a980-428e-8b65-b59ce5e8e854": {"doc_hash": "918150c9279a37a894a916f0b71e7a1023b93e73800ef516fda2ed3049cede22"}, "82eac73c-42df-4a02-b150-f4f8a701ada5": {"doc_hash": "fec41df5b073c91cbd524f33fe2d490fcad75f55d2825d8a8aeda4a981600c3d"}, "ba016021-8d67-4488-93e4-501c0a5b7d7b": {"doc_hash": "c0055194d1f7da9e40e94141cc8af7b088b7266de0abd5ce2b2a23324a023069"}, "b1136ef5-0a9b-4b98-bd7e-e9f558c818dc": {"doc_hash": "2542581731dc70f3a01c557c23e9313c9c979412b758d34c7f323065e8ed3b21"}, "d391ee38-c10a-4005-9b69-e3cc8a255671": {"doc_hash": 
"6468b12fcc01db7ce93a8f2857a82a86f67d59048d63070f14f137fbbc8b8e71"}, "aa745544-1110-4eca-a641-489f2f1a44fa": {"doc_hash": "6050cba6d6f685bb44e3afee9fdf76907a66a3fe48e686931d59862ad4919759"}, "aeb52f72-31b0-4786-b8fa-9f0107928cbf": {"doc_hash": "0722b4e203f3585cfa0ebb7a206174a5a8673be55e3c0c744987e434bd6b5020"}, "c49d5b70-70ca-4ccf-a470-0fc96cc0098f": {"doc_hash": "aab26943b8fbe50e25f86729f32af382327855a7bb53fed0cdb75f58dd1e62d6"}, "053ac88c-7713-4384-8777-732db8b2d85c": {"doc_hash": "78d8a07888ec4323f61b57885eaf61b51d955c1231fbf1f20371857861ae02ea"}, "f0a3bbce-445a-43b6-963c-b04145a54a4e": {"doc_hash": "dba999a98ec526c1f959bdcd787e11f4c65d700d7fe991478940d549ef19ca85"}, "ab29837b-8333-486d-a14b-691acf3491ce": {"doc_hash": "3ae60d440360758cb13128880d75545e82fe0d9506453b632ae3766a8280f536"}, "3074e5fb-ae43-45f7-bc05-1e124a424e15": {"doc_hash": "9e7d64fa00bc280d931201c098d8735c8b93c08579959bcc5b102e9d88541f80"}, "265c91d5-129c-4172-bee0-de86d9f76028": {"doc_hash": "19818ba5cbd8d32a704ff3f9f0d9f793a6a9bf4726175127e0b43922aaad1071"}, "21e122af-aea6-4d94-95d6-35445476f784": {"doc_hash": "97c5edfcf1bc55c36f40813018e6b573667cd2823acda5aea6f5979e8d3accd0"}, "861e2aab-9812-4285-9849-7a1250c34b8d": {"doc_hash": "6adf8120dc0591ffed634582e949e3e5ea7a8a3d172b3f9e22a29427653dbb43"}, "6706fe45-5896-4fae-8559-315d05704ae9": {"doc_hash": "091cc119efb8604ca37cfb6815659de9ce1966c2b9212271af216c291162aa5f"}, "036b8345-a569-4aa8-bf7a-02180293a215": {"doc_hash": "15f605dff8a75d4fcce8dcc402aaa1590acb3f6a13d43db0c1c76e40324a1d28"}, "72e154ef-aedb-4daf-8db1-ab7564845a2b": {"doc_hash": "b4a23cad753a0f46717bbffc51ac2182f0453973430d9adc75c9d6cd1673ae54"}, "1e70dfad-28dd-4044-950e-3542fbbc8ac2": {"doc_hash": "2fd0a32a0b4dcd1aa7aba2b771b32fb98894ffabbaf4150ef978e0a988995b28"}, "e4c238c9-b200-4e21-bf91-7fb83ebbcfdb": {"doc_hash": "8b723b1a6715eaa3ba7e10cf13f92851d06e96c321adf52ba6e091e5bd4b694e"}, "ddeb2730-9672-403c-b80e-d0050bdf6f97": {"doc_hash": 
"905af1af0d7b46c312e7e10bde4287fe520487673187690613a5ceca073f0998"}, "3cf6b347-89be-4061-91bd-30d2df0e7767": {"doc_hash": "d99111476d403f161e0459ecd01e5818054a187e8987a3537d485c57b7ad87be"}, "5969bf76-79c6-4918-9bb6-f22512fb29a5": {"doc_hash": "d1cb21e6ca1fb8ffe447a4d6626835302a08ce34d72d2eda3db4851dd0a001d0"}, "157a14f3-d44c-42b0-894b-837ddc9472a8": {"doc_hash": "5e79e0781d15a663080b740ccfaf6808cf73cf308f05d93b1bd7c58f24df2065"}, "8f94bd99-d6a1-40c0-958a-70581ac48e1a": {"doc_hash": "3e37d18d7c64b6c7f19df92b965aa6c49261196ecdd849fad4061d7f5f17f654"}, "40fc4333-240f-447a-91a0-6bff2045d0cd": {"doc_hash": "5ce594984deb4b35e33ff92242c9e0b6f69bff31acee838600042b3c2a79743f"}, "b58cf4fa-39e2-4bd9-b189-e40517a421c5": {"doc_hash": "b9da012daefd8b9d4c5a53dd8729bf55d0eb0d8a33678f7a509881abd356a84f"}, "c1d5c2ba-e6e7-4313-a340-95be0d8f5f9d": {"doc_hash": "c7fdea9e324a6f0a895e9667c16e42d4a4e4e9d6d233cf4bebd288d0458fa3d9"}, "b145562e-7be1-4108-b091-ebd9b955c305": {"doc_hash": "8eb4e2476449c0240a0c02317d78d4fe084da5a47f59fe5c75f62816a09210bf"}, "7ee09460-f4ff-4f37-baf7-9abba99ac13f": {"doc_hash": "71be341509a411c8a6666e1794228eeaa1e7371cc6ef25035b526fdb60c01705"}, "5588a1bb-7d78-4e7d-8cfc-320ffbc00b88": {"doc_hash": "74a48596a37f74ea7954a23d78e0a66a2a1b9a6cec59c5377ab6d1468ecd84d1"}, "296a20bf-8a0f-4e1e-a05b-06636a44baa2": {"doc_hash": "d57cdefcb970cc2c324c5751fbd63e440aa072427b88b4fdab7f3e71cb3eefb7"}, "7f27d2df-911a-4d29-bce2-95cb0f909bc1": {"doc_hash": "847eff2058214aeb88af94fc709db2c93bd3362ff40ef5902a4540da1b406fb1"}, "64f02ceb-b807-45de-b299-1f01f7caefba": {"doc_hash": "d386b44db69721fe78e196c9076c31a65cf8d7b53d446592510cf5e82b6027a8"}, "79cb2315-52bd-4428-a60d-b41ea629d01d": {"doc_hash": "469fb89e09a3e56a7d000b3bbac287447dd0f3940dbd0f5ed5c85a4e46aa4089"}, "b2315260-a877-4be4-82c8-9f63a64d87c5": {"doc_hash": "a38d2a61af1d7e1c54bacb12eded08f6a7c348702203575e93d2a47de45579af"}, "152f6ac3-d28f-4eab-b030-9d05f81021ab": {"doc_hash": 
"799e03948a7d7a84454722a1a47cb9c7873e57471dfa12ccd91126441045f8f1"}, "40b4edd4-40c6-4a24-b378-63d0e8220a5a": {"doc_hash": "98bf3e2830c46460ea65b10f9b021873410372d056652fbbff9adab7b6b33f0e"}, "682dd507-d1be-4d50-927b-96487a76f149": {"doc_hash": "ccbdb754e70c2a421eb6ec9892b06d76716ec9c4dab6b5986f81dacf1dd04ac6"}, "2f10a780-6d68-480c-95e8-4b48001aa801": {"doc_hash": "9870c7457affc6fa06771d6d0bdf4ad985f507b33f42bda9efd1ad8d86cdd991"}, "a2f918da-bd69-40f8-8cda-3dedefe1ec37": {"doc_hash": "6e486521e2e9b959a6da76b893aebe812f1f51a2935100f10d09cc66db3e3d82"}, "c35dd562-e75b-4146-9650-6a7cfe89cba3": {"doc_hash": "e9d42058d64c028b6ccc1fd40629cf1bde3b8d70e9aabf6f238a6789389c38e2"}, "d2548140-6d4b-4fd6-8849-91d4e70cb760": {"doc_hash": "ff31c28587d1730cd9c0991c56ab899a0cb97c1fb8373483d81369c882dbfba4"}, "1a8930bf-4fd9-4a74-af3f-4580e95780da": {"doc_hash": "25112b6ec66317452e3506d248cbfc33cbf32602b5955ea4daa819ff7cfbd096"}, "de63c400-c1db-4618-b917-0c78f43ea26e": {"doc_hash": "b52c04a2afd23987ee4a9cc07fa8a812e2789a3978d412a7e0cae5cc9dc326a7"}, "86f1119d-a77e-4af0-b681-7897a7cbb18a": {"doc_hash": "b9377452f4bda3526253bd7fb45461be6d89106d67e83e39f680d82164f34475"}, "fad4750c-084b-4e1d-84ee-e06c9295a3fa": {"doc_hash": "31b6c8edef43291bd97a85fd53aeed3a77e009662f18272548d9fe0e61321d9c"}, "47523fe7-3a09-4a73-a211-4ec70082c17b": {"doc_hash": "b55ba1a3343fdb6ae4a16ef1bb40f195cff3817fa794d654323e7d2e325b5807"}, "11fee860-d56f-42cd-b25b-ddb1cdeff9c9": {"doc_hash": "f7d6b732b7a8949a7d9e84394982fb5cd66f90e90692dfb60fac3da0fde9d311"}, "97cddee0-5799-4d08-935b-386befe953ee": {"doc_hash": "b4efa4b2a57e66142c27a1b6f8df5fdce9d69cb091258b641b1082d7e872bc07"}, "3ae528d4-5ec4-41df-a9fd-0ad5bca070d1": {"doc_hash": "5634d740de4b75a12998995da2b96f71ec4ca27b01ac18aa45cd550e6bede5c0"}, "08e8e0ac-89dc-4fe3-97be-080267d86f84": {"doc_hash": "04334df02e6e83cde36f0512923953d0b41e0446526a31065c3e7a5d8a824859"}, "3e06f4c7-dfff-4bce-8fab-fa6f44831c15": {"doc_hash": 
"7cc734edb9024b6c675246eb41d225001abefef8eb5d9298f7939a0d904f3198"}, "2cd0847d-0817-4d1f-8621-817ec70791b6": {"doc_hash": "11d6420d662345a0af69f5ef9390dea134319c5783381706a2cb0160f11049e1"}, "f7a36bc3-a6c1-4aa3-b687-bee182117fc8": {"doc_hash": "7cf9ae99b887b7364967e629dd8169fbde1bb33d7c08bd76c863309d53bb8795"}, "bb326f96-34de-41f9-8ff3-a2d24f33ac30": {"doc_hash": "163341855b542407669373169923af01f895274eb2315a5f869a28f9b43435af"}, "984c7055-96fa-49a0-b4f6-40c4ca0e711b": {"doc_hash": "2a5cdb2e9e9eb4378b46e6f389feb3059866b6bb0e1fcbe33609fc6d389a6cba"}, "77c05a51-f23d-46e8-baca-d72546adb0b9": {"doc_hash": "03276f9d5d1b2e82cb85ae4f70acdd091744296bfc941942b82200d50c43ffc3"}, "4bf7335c-c91e-41b8-8acf-37d2432584e8": {"doc_hash": "07ecf1898b8e4895056ee791d24e25be9e4e71e04f488c9a6d4634d3e1818bf6"}, "88d6f346-ea94-47bb-bc80-1f5c0cab012c": {"doc_hash": "f296753dc4a5c502de18563cd0cc6446b65b61a81928451a97f4ebf9f04c3272"}, "86be8a96-629f-4030-bb1f-59e9f261d6d6": {"doc_hash": "207c1a9288e19c87915b9de877b27bd31e4e97684bfd7b4ef9a1daedc875b58f"}, "608941a5-268b-46bf-81d6-d18c813af6e7": {"doc_hash": "e2904fa1ee9a67e4ea54be25dd077262d39885e0c8f37d11c459ceb9ad7af644"}, "16353cfb-da8b-4e0f-a54f-ec8d73ff5fd6": {"doc_hash": "e67a9fc901e825bb73d6c7e267027d20a914d9d9bfd5b4000ef814b39a44cf44"}, "d01520cc-d5c5-40b8-bddb-9ce0742bd721": {"doc_hash": "b9537ab6093086c348f9c87218e7101fa755c1a196404f798cff36b542784e73"}, "0a0e8534-984b-4b56-9b9b-304eb4404cb7": {"doc_hash": "08e428f0f7e31859434b2d013fa04640c2f626c842fc339d29ed90c2020e332e"}, "2b3f3311-511b-4e6d-8a0b-0b94e0d80812": {"doc_hash": "5182c6738412f2833fb02436e9c4464d5bebb76b0d336b4cddb3b859d41b2549"}, "e9a78a30-404a-439e-9818-5eec87992c68": {"doc_hash": "a49fe1d5acb13c38ca3ada7b535d1e009ab0c34cc568d9f81ae8e81c0ae24f3c"}, "a64e07fe-146e-4017-9281-f28ff291ece8": {"doc_hash": "8df933b129e39f2750e4822a927bdd66d49338c4934bd0fd2d2f15b44ab8f059"}, "45067ff5-7d8c-461e-a6c5-df6ec7bb1f8d": {"doc_hash": 
"ebad8f84c84d92de5deaa48ab2beb8d024d85c0de751a9ef61c7e8752ae79830"}, "6bcdb43b-10e8-4484-8ecd-611a0c8d860e": {"doc_hash": "7cf9e18edf9d65a7606ae82c285ce1b85d30817da719f39f66bce6d0eb0a1394"}, "64c8962f-e9b1-48d9-87f5-e5c326033c72": {"doc_hash": "e134f49889a1722a8ac13df4198a2d97c65b98acd380b8030a270b2eb90907fa"}, "c0137e30-77da-4aed-af8b-5b426fffc987": {"doc_hash": "cc7261b215598c5aaba1e3b0d10542225efb40fd471327321b62f71c819becf4"}, "425a66ef-db1d-4201-9ff8-96df1ed57937": {"doc_hash": "9b8001583bbbf16fa2582527cbb572502e109c9b8f4ff303f225538d3adad9bc"}, "70f13571-b355-456c-a567-9138bce04645": {"doc_hash": "3b79b12dcf499d73bbc813e4486bbecb38d8d2763487f93971fbb7ce0adfeab8"}, "715ebe03-a2e2-4e58-9286-0d4ec4829de4": {"doc_hash": "7fa0bac6e957d22aa2cc9f8a0c75d9ac4019d5785a17eda349cc82439acbf011"}, "5d09bb37-86d1-4fe5-9e67-f4e38b31be7a": {"doc_hash": "37848aca8a0ad27c4c2658cf0d9f4658a5a7307bf3978ec22e41941455115419"}, "9be4cb6d-10b8-42c3-9c61-740d772e26d4": {"doc_hash": "1b9856e4dcb1c288b746088afe54a6484bdd3df1e9d9fba1dad32911963cfd42"}, "a0d101a4-286c-4579-aac2-40278be78b56": {"doc_hash": "54d3ed556b0cb5fb9740c21ea05ae19055fc9d9a3bbd578e73b15a0088b6f6f4"}, "c0e91153-90ae-4095-bdba-f028e388dc42": {"doc_hash": "4539b466e91cdb057502bd14e8ecec6fcfddba7bf092381742b6aa8833b82767"}, "6d2428e2-11f1-41c5-9a4b-871028fd5df7": {"doc_hash": "397d78afe9958283c7a6bea042c9d70826ffc0ae74013f2bf5f26f838c0247da"}, "df6e22a6-4d98-4305-93f0-19d675691022": {"doc_hash": "c9a72476ca00ad543f227e0927058d73c8ba44762c3a4e4767906c30bc3a4d01"}, "a1636b6d-58a1-4f6f-bb1c-60f90b9d771a": {"doc_hash": "124bd8f5ae11a5c4dcc1996524a98b5b4724db9967294fe4cfc2992de3dd986c"}, "e9f4e436-7d08-4e52-9d61-79d9fab06385": {"doc_hash": "302152a330647cbfaa4583e56777309da704fe53ded4e95d1ee38b49d5690dc9"}, "1068dcdd-a59b-40cb-b46f-a3b4c70dc523": {"doc_hash": "5cf2db0bd55dcc1fd720be77077d6e83f24593ded3bc88ebff7f87926b4294d2"}, "eb80239a-fe08-43c7-ba0b-62e7f006a355": {"doc_hash": 
"7f3108694ea4d1023247782fa014cf05ee02a2f729290d76d617814a5769aa45"}, "6907425c-660e-4aff-9b60-3ad05c6dcf97": {"doc_hash": "4a4ea64f5403ea5296478b555e2870ff6ff3c06d61c9ba55fe68dc764ad37f86"}, "915b9556-ef46-4c11-8683-ec7df1062351": {"doc_hash": "b45db2506aed4004ccc1324c6356adf338b548ca8e4b84128cc7223c4a773faa"}, "30b9c515-3082-4fd5-b577-481d92c8ff24": {"doc_hash": "911b2c20f35afcb82da5b73e6750a34955617151ffa1814e3b89379bc91c2696"}, "fcfc4d72-d760-4179-845f-57d890accc4c": {"doc_hash": "1e192cc0a39e39457438ba6ced3bffecfee2d3d4d5e0fc9461b4b683c389c3d6"}, "5089da61-cabb-48d8-a92e-25e1608fe005": {"doc_hash": "db411feb7c9e123471a66f8dd39b29b67b749b6f25759e0ccfe1be518ade6632"}, "3be8fcdb-fc57-4afb-8214-3486ba4f76f5": {"doc_hash": "7e90f86569a41a606e0b4c65d7e235a5c62ac18b0ba360dd13a7c04e1d4b04e6"}, "84022bb6-1e17-41de-99fd-d2ca927fb232": {"doc_hash": "249b5ec39e9761d2b69ad32213565b2bafb90e18e6c07ce924e67e50f42ffcb0"}, "a69ca2fc-6a9f-4dac-902e-2035b92c9f7b": {"doc_hash": "cb7725f25172b9515c052dc6af59ddca3003eb9df720337b77e55e7b69d6d56e"}, "32ac266a-f9a2-48c6-94f5-ee54f90bd4ba": {"doc_hash": "d72acbc04d676483e70ade72178765d26da0e686564789e18640946985818ba2"}, "22ec3058-6fd4-483c-86b9-97cf91ace780": {"doc_hash": "7de1e5a9cd17c71892806d1c17e81edc6699dabab426ea433ed35a62ebb3487f"}, "a38e5cf8-2206-46d5-abe0-9ac38e286ec5": {"doc_hash": "dab8fa82e72e76802b077d1cdb32609f0830eda34b14b1e75d89e729132685f7"}, "0d654b22-ab0e-442a-a395-5966b3cb35ce": {"doc_hash": "020b607e06588fa9e13f21f41aa14d41f40f148ab46631a7de9e02ec023d0720"}, "d9605f2e-0fa1-4295-8361-c65ddd8cd11e": {"doc_hash": "bca0f60da74d40d4d9f69413ff474683dfc2590445ea19d357ace86f4d4c02c1"}, "a8f82915-1f64-475e-88eb-58c881cd7cf9": {"doc_hash": "38f5d5439ffb8e8c279aaae5ed0abe3b1ca12abe34767f90737bb9bc3ed0e105"}, "6ecfdd37-d09c-428b-b11c-bd02fb59c2dc": {"doc_hash": "d8a14e0e0bea61dbada2ce95987fb32f097f3985c27d83e45d317c716cd63d90"}, "6c7f01ee-7f4c-48f8-9fd2-7c9092028f9d": {"doc_hash": 
"89fd67c5e5f09b96ef83262bbced32a51aa8ed36c680eed8cad350e1cbfa402f"}, "24e707bb-2f56-4d73-9a7b-a9f3450736b7": {"doc_hash": "ad1e5524273050e3d5a4f45fdd5b132319a9174468fc7e453c947303d0df0b11"}, "071a396b-4d24-4dd8-abf7-d7a4b448fc34": {"doc_hash": "62816877a1d84cfd3d2729829ca08c2445e50cd3289924e614db9efac27d6b83"}, "4d372e8d-741b-43f4-a33b-2a76c8512a5e": {"doc_hash": "a109c31ae0dc5eccd11762ef83a6ec4fb514b163c46cc5229f6c080bb578858c"}, "a27dca37-0d94-423f-a50e-1c22fe7ce184": {"doc_hash": "1cfebf013b4d366bdcf4dea58e64b5aeccde1d3741fecdbb25c34399a505055f"}, "ccf6e729-0494-4b92-b3bd-d59732341a8b": {"doc_hash": "d5e68aee9b3d0468d0d3f186fd21322aa6d6902c2b28f086b55caa0a141e9101"}, "40204e42-2368-4d32-be84-98b2d7634869": {"doc_hash": "cc6627038a50cf69c075a42b29b9505ebd3be2255170328ed15178f488d407d1"}, "9721d091-235e-4e6a-b485-8a5197d2229d": {"doc_hash": "22cda4d30949a41c2101e76fe88c2b1cb63db192a7b9f53c51d89539569a2115"}, "9809410f-9132-4ac7-bc52-ec813fe0926f": {"doc_hash": "dcd5c9c7e352cd38b8a5cef2947177d4d25b5e2b88e6c145e0f5412ed2813ff1"}, "2380fc44-48cd-4034-aebf-74aa7961149e": {"doc_hash": "3dd01e18e40ecd9691383ab5c2a131b8834302a3a578d65f4e3be27c072f98dc"}, "3ffdb664-f1f8-4e2f-a25f-0a30de5cd50d": {"doc_hash": "0631baa86f9b70fa81a5abf314ff5367b5126a533a4af9e33e8b8ea519257b6c"}, "4935ef5b-5813-4b20-bf09-5fa0da6cb12c": {"doc_hash": "d2a29ce35a02784f5e5b07b48a68c3569edefb37e2f496325020a1a3f9283611"}, "05b644bf-3603-4ba2-9c82-912b1d8fe914": {"doc_hash": "e0eca76999bf32f23b28356e5cd6ec404ba29f17bc75f35279ffdd31ac705d30"}, "b6b66b2e-5c5d-4a50-b13b-3670965e5597": {"doc_hash": "ce69ca9d31c790a2c5b620052add1ab175460ca55c6d6c760dd514c1867648b1"}, "c11a918f-a686-41d7-968b-83ab87513f56": {"doc_hash": "6ba2ccbcbccdf1d738ed10ecd51c5b0df451b95f68d44f647364a3c979c8cb41"}, "431b4ce7-f2a8-4008-9943-51ee63013725": {"doc_hash": "450143ffabec5c21a72f8be8519199b04df838a7b0194c6c397d34c51eed2c1b"}, "5728cbd7-2018-4b09-9076-7b6146bdc50d": {"doc_hash": 
"06dc5fd4df0d0a65a7015ff5d9e2f61abf00b5e1806881b5355e4b7504301cd5"}, "8d810076-8b93-4240-803c-b4451848f4ff": {"doc_hash": "7daade5aecabc96d74edb22138fc7660af7b85fe378179b86da34817066eca2e"}, "31f4d830-13e3-4b6e-a3bc-73fb75c40bd9": {"doc_hash": "cc43c09634868a83682999421dbbe5a230d954a1f19eb206eff669d563cf558b"}, "98effcc6-a140-4126-935e-2a5f1150ce2c": {"doc_hash": "538cfd0f19e9b5bf6d9f4d7e48257ab63fd0f9291e72ebd3aa4a4e229aeeaf13"}, "23a87aaa-93fe-40e5-bd66-600e6a24d8cb": {"doc_hash": "b81948e026b62bdc05f6f70151b1b872072dc9fab9ae4dcb0ebc2884bf0ec422"}, "6505eda7-0f1d-47b2-a143-31e31e6775a7": {"doc_hash": "7d3bcd4bd25f3e7081c73941522ca415c2fdb1f4852d7416e8bcedd2888d52ef"}, "a1470842-4cbd-45fb-9e71-6004b63c1ff0": {"doc_hash": "84b698521f6491736ea450ff0b148d241943c2e065b29c3852a505ed84e18949"}, "93bc0650-55fc-4fe0-bc1c-0156df9181d9": {"doc_hash": "171913c1fc70cf5de627b950916d1e623bf4249ab8ded309b8ac42cdf843c80b"}, "de675a68-a1ae-4ede-8fc4-433c43bc3949": {"doc_hash": "d0aad9719cbb3329330e528f086eb4c51ac6142d07e68acbd76d4fd1031c7b53"}, "687ab5b3-dae8-4f13-925b-53e6cedc4206": {"doc_hash": "37ad23dbafc8f2f078feace4cde784e98e0df4b042e61386ee79b592c201e2ab"}, "92857046-d0f4-4b99-b507-f7c29b708f3a": {"doc_hash": "2b43bcd4c8a26d5558e3e4a31bd6d0fb72cdb5a817149bd04625563f93253144"}, "47efd296-7683-40a4-b25a-0289c44a5574": {"doc_hash": "96a7043686c8a84e166b117ae30a8fbb08a91f90ddd88a8a36ffc671f6a2e2aa"}, "de10bc30-2086-42c2-8663-02927d06037a": {"doc_hash": "1dd8e58d7bf084794a4a78f3d7ac553dbf686d71d6a475a6d26699f7a591e9e1"}, "d0e5740c-4900-4891-a9cc-bb46207b9572": {"doc_hash": "63cc393b390948b281b87e11897fd4761510166e9e0c6781e229ebb0f95d56be"}, "d77c7ec8-96d9-42db-b75d-2d25de2fcb54": {"doc_hash": "f137f876b594cd0d5bb52f13414afdd169c073e79547aad526e5018522de5435"}, "12f56622-0ed8-49d2-bab5-c99d64e16074": {"doc_hash": "96d1156723b5cdd7c223eef9aa918a6466fe3467400bcd95dea979036384531b"}, "136c588d-0495-4e31-9fa8-02cce5cfce6c": {"doc_hash": 
"76fef3fd33d7e42c0a6711425a9c74f886973f2b4d61cca832fe304692506889"}, "bb99cc00-3055-40dd-92d4-ccc7ea673c81": {"doc_hash": "e97fb900f92f05fd5411603522e10db3fd5e56fdb63464e8aa2ffe36f0b20119"}, "a0c4883f-7f61-4da0-a93c-50fcf7696204": {"doc_hash": "bd1d623344cd65ac9a223370f034be46decb44b2d6d0a4309923de2449abc2d7"}, "d720fa18-c4f3-400b-b11c-e832487384d0": {"doc_hash": "9367f386931dc96d7a0cfe06a7671c3e9fe508e61ac69d234337a83b264d4cd9"}, "6e92df35-22f0-4985-8c5d-ef8578bd2403": {"doc_hash": "96687c93ce75a3655e4d2d9c7f3fc94fef8409961dd9246d17554e93c980f3ce"}, "ee78622d-89bb-4115-83f1-b1eabfa07ae5": {"doc_hash": "7686d80d07e196da30640a71f79ab4bf3103fe29b3c2d880c05140a1fc4a531b"}, "357521f8-eb0f-4f06-ad73-46b1840dee14": {"doc_hash": "97cd16b301f99aedeb083bfaa075cdfa813d0b00158ae3a4daa234aa8ac0a52a"}, "b9727fcf-83c0-41cb-904e-f3eb541f2fd4": {"doc_hash": "fd2a9c5dd54166423ebdf77352ebfa465ae83d978f5986e7d57d90b79806246e"}, "3def8f76-66c9-41a3-a089-029020d79bda": {"doc_hash": "44f1301c6a6a3a1a660c030404c8d425446b1cea761ca5e8883fbc54d2b4105b"}, "28457a4f-6ee7-4c36-a0cd-c5dac8f7dfe2": {"doc_hash": "51097e8ab461176efb4ce49ad683620a6b8aa66b0e92fa613e95d7d8efc7e819"}, "bc5fd7d0-0513-4081-997c-ba1caf5bf0fa": {"doc_hash": "fa91a6275271ffab85dfb06a68b48b1467db688076371282e3848a163115cfce"}, "88d737af-4fd7-4ba2-be27-d4920dcec747": {"doc_hash": "a13797d2ff96b1575ec9b7153a678ad5e71fb01714183c3a044ebb1abbae5038"}, "6b44fc9f-a7a9-4ea8-a48b-fd72b9174395": {"doc_hash": "f3f23130b46f4f279558d758b09c719e02268f0e4be84668c0fcb79a68c00c96"}, "149e8c2b-c1b2-48fb-9a1d-fc5db1cb3157": {"doc_hash": "b2502b6042c6d717fc4020fc486c8619a81b59828b81acfa7711d1abe8d3cc5a"}, "b706000e-4708-4f0d-8619-f1e940fde5af": {"doc_hash": "3e98dceb91772a9929a4803c4d199f825e6480afdd451423f1c58deef0c2a439"}, "475397c1-4c08-4bd1-b910-98352092f09d": {"doc_hash": "651caa6577e4ad1cddb10b09272f0b1ae42bbc57db1f2c41e0a21e506dbefbef"}, "2888e4b8-38b8-4093-ac3f-d8ea476d379f": {"doc_hash": 
"86700da9c6d7640f4ca87b1308932cf3c07f0a3bd7fdff83c0eed7ff6adf4312"}, "1ad4381f-43ef-4452-88e7-f4af066283d7": {"doc_hash": "5de87a00c99bc84df8ba5da6a58ee4ac4b77b7793db1f8e4d7c744f58ee03727"}, "7322738e-0bef-4377-9dce-6b1e3955ad1f": {"doc_hash": "f51e2006117888e6f8a696c820e01c1ea1295911e09ec8421950b3e1568dceac"}, "9d28b341-1fd6-446e-a849-dd0a54593ced": {"doc_hash": "0bf2f3921247d98638fa7896b14a96b897bd5e0df5c019e7e613cb43defeb85b"}, "c4236f4b-aa15-4540-9680-b40477a0a139": {"doc_hash": "3e765eb3381411a9f43a1b5869c3f4da27bddd7ca22cc793d6e34ca50491ebb6"}, "baaaef3c-1c79-4e88-a1d4-178c9e53f773": {"doc_hash": "1b32fee49fc300a8d3111bb89f9fc36af53ca765e09e794d6dc6a964c2724663"}, "8f898e53-daaf-41b6-a319-7adf41fb7b8e": {"doc_hash": "11334c7f2c97bf0f1b15198911b9bf20b4cd4af37c261973fce7ce20ce3b45c3"}, "62488ccd-7bb3-4b5e-9b7f-66ceb3073cf9": {"doc_hash": "775b68f051585ec4ed7abfb0d27fbb339babc2d60729f52662d083e300399b62"}, "39f878a5-cccc-449c-9f57-9546a2b34705": {"doc_hash": "5b6425bf4329b0fc36a2e2591f0ddf49cc4325f43f75ab68e3f70a3414c32a38"}, "21c1c138-1807-4e6f-9f73-ae4166f1a5fd": {"doc_hash": "02ddaed25ca5ce809b40d7daae24e3a319da6efa11ead06888d615ec3bf929a3"}, "93348fe5-9a2f-4a5e-9932-1bef47778c75": {"doc_hash": "9974fb382b1479fc14031e3d2b25fd413201a53b58157b486ac6b7c1298cb7f4"}, "2e8337b9-dd16-4a80-a467-f638c560d4fe": {"doc_hash": "2e790c6da4246a13f0afedf836bb8afdba7dce3c1ec54af952609a05bfb9e220"}, "78c8e36a-7a5d-4eff-8ff0-bce6a37eefa2": {"doc_hash": "1e204d5ede50521fc5106de5196af5c50626cec0b99ca2dac7330278c9aacbdc"}, "6acbcf9d-f76f-4c7d-8373-d88059f7c7c4": {"doc_hash": "4e0b8bf2ea311b7f47acf8a245d8b96a2967547d26bee437c20e119c0bd89a00"}, "9af98ce5-97de-4a99-a2cf-c75acc86eeee": {"doc_hash": "40b17e5182ecbb18d365166fc9f41cf9b8968cb8c1533eb0e4784da1ac023ddc"}, "58f3bff3-d0d5-4f4a-97b5-f22178960d2a": {"doc_hash": "d177d63a11b2cc663ed0744afa44155b1ad69d2b00e2dbca1fc57bb069db4789"}, "38e69b9e-6861-4950-b716-d001d41443e9": {"doc_hash": 
"ffbea6393770a35155a047fff610390454365f16cb50558d09eda42825116660"}, "a87953c8-dd45-4a8d-bf19-7b72c498b9d5": {"doc_hash": "05fb346a2c2c1909f9f210d91790405ca5cd461a1182156c0e4f8019d344195e"}, "24aabd37-8341-4cb0-9831-6d083cbc8591": {"doc_hash": "0f7f1b154ef597999bb3b7114edf358432b4a5aa03ec763413b1f707f20d4072"}, "9dc52486-2238-4c12-bbba-6097b146b956": {"doc_hash": "3f9093c3669b978c3ac26d592e5100ce4871093dfd5375619153ebdcca128e74"}, "69a9221a-3e72-4b28-bd3c-93c44c134794": {"doc_hash": "750a88d9a2d82b9359feba9e75280b8f2396ce31c3322ce014ecf9dc4bbab5be"}, "1a2a9bd3-de45-400c-8ee5-b8d8377eb8bd": {"doc_hash": "27e6bca4ba8909349a0144eec8d226a896333be9e8bae99476f4d2a31139cdb6"}, "a0220bf7-7e3d-4364-9ed7-f8f7d74000d8": {"doc_hash": "08c62f4fb27e7af16048685d7acdade93bafbec8cff346d4afbbbf1ea3978499"}, "a4d12606-d0a2-4bc5-9d1b-806fd23650e5": {"doc_hash": "07c68b32542eac7f0dc220a5d9665f491e1e9d9b4ad21993c736012aa444d30d"}, "232b1f5a-65f0-4d62-8b30-4400fd6e026b": {"doc_hash": "7dae33834de644a5cfbcb7a7c7161bcbae3bacf0f1c32ba811836339b683a7eb"}, "3703639c-2f7b-4e1a-8580-fe44bd0c3061": {"doc_hash": "539f7bba8f100fe41c92b4ee8e4195664b96fef2cf98bc64bd11405b209b7600"}, "9ec0d2c0-d4d4-4d02-9cdb-293eeaaf4e53": {"doc_hash": "af70c67dfa8c3ff5f54ccc992d22cc62f640ce936d372d978c9c8ef13a262af3"}, "60646b7f-bc76-4062-996a-b313b600fb71": {"doc_hash": "49c3247e91080a750e5584948d1210dc1c5faf27f1660580e11ac458ff4981a8"}, "8727a5c1-1679-4734-ac27-0aa0eb127c94": {"doc_hash": "00b77e3dcffe2b078fa79942925ad034f539b9d4ea127b547e1f2d89eb9e25fd"}, "71a9a067-21f8-4c98-b3af-1faed64fcaae": {"doc_hash": "ba3b6370a9862e4ceb98a30fcc9408d04490985453892bc11479de5fad993fd8"}, "13cb69d2-b0f9-4694-a116-1b5949a40b38": {"doc_hash": "03fb4dcde611cf5dc2283967b2e03d5f50ee2f9f01d5a81b7fec1dfb739992d7"}, "880c70bd-2a15-465f-878b-a8f1b53fdafc": {"doc_hash": "7f9c867f2b2d613795384bf88f2f68bbc665c1c8f5a70ec325541e2f5673c03c"}, "564fe347-2515-4fcb-a646-0d124eb5c6bf": {"doc_hash": 
"2fff6d829ced6765ccc4897f72c867c668dc9e5967cd9ddf6af2127ab045f760"}, "08f45876-e4b2-45bc-af5a-e2d6d7798ab6": {"doc_hash": "a0c37e4201a97cca7e241b8c64dff2a1e58d8378b862c1e9702d0e972644ff91"}, "44cb4582-503f-4660-aaae-44e7165d483d": {"doc_hash": "532cca97cc8e076c9c759409134eb6a95410037c3a8d6db1467b705769b54227"}, "08a7e067-a6b7-45a0-8132-4689a6bb3b33": {"doc_hash": "6666df553218d9b319b1f91055d845ee003f60dc178314f0a39a1f16a3e2e363"}, "595e6de2-1c6b-4393-8128-b6448b1b56d0": {"doc_hash": "f18026a8d1ba14ada0045dee40d4f23d0532d27e5ddb734a132fd360e60d5f43"}, "0a85a178-f894-4802-bdb2-d68169832fb9": {"doc_hash": "eb9f96bcca3978a83225446c9b5df77ee1a062a1ae71f7b518eb9f1fec2c8652"}, "cda93fbd-33f4-41f4-bf3e-9af0d813a1db": {"doc_hash": "9a93b42c1c2a58cab91a45448421c1ec6d8a9946cc8bbd2baa26782add179bbe"}, "18f4079e-64c7-45df-ab84-4f189a2a47ae": {"doc_hash": "d425cb0d7624834f354067a52cbe437555d1a835088fa6b80eb6edcea442f34f"}, "e37d3e5f-7696-4cc0-9456-0d19c2be245c": {"doc_hash": "f5a120d1dd5d5a07c4125cf90b7f1f3f0625a8f7a37dd21de3116a15f44e7599"}, "9f191592-ba27-463d-a7d5-630d7a58cbaf": {"doc_hash": "aec54bf98b1c48002a329a543d17e0c487a55341b32e9487d3e6ea3c0fbca762"}, "95cca840-7166-42fe-99be-2306b0c23ac9": {"doc_hash": "691fdb30d374b46043e32de21a982fcc8cf124d0efdc81d6b8e977ed4383f494"}, "1eefa34c-688d-48ab-ba52-d656c2872f6a": {"doc_hash": "1023ea4b3671ed14da4e039ae8850de0626d97ae7382d7423adfd9da339f749d"}, "80083296-5afe-41f0-ada1-5e0f2b65caf5": {"doc_hash": "7942310f5ba4e7162f43ced66642283349348fcc1224665cbddacf99ee415c11"}, "0cd47f07-f7cb-4abf-b49d-469ee670c849": {"doc_hash": "21ad2c9a0ba2d61c4c3569c522bf26e7c20ba194a811378e2bf39a12c18e02ee"}, "701e3b79-69f7-49f5-8442-dc73f629780e": {"doc_hash": "af6750198e57d285b42f3582e9dd50341647eba6317a9e6dce7eb35e1ced3071"}, "a7f4646f-2a60-4731-865b-93929dafd19b": {"doc_hash": "8fce4410ff7a562040e8ec998c1709423e1ec42c266d4f0a077bfc42ceff336f"}, "99089c0f-ea61-41a6-9dd7-df98b3d0b03e": {"doc_hash": 
"c6a2a3482ea4649529eb2da34c9e0776b4812075befa638cf43ee82bb0c03444"}, "3c8beb05-94c5-48fe-ac76-1a130a4c894d": {"doc_hash": "891133153f158d391952759a43a1c0650b9992e11c85ff3779c2d46062ec90a5"}, "02677a0f-27fe-4b14-9c1b-536294b7363f": {"doc_hash": "2c55c36bf011ae32bb833fa16eaaf54758b0e3d364fb45991b9bd895d3de0231"}, "1752179a-dfc4-402a-afcd-720710470f36": {"doc_hash": "7923dd31001f9b74b979ca44841273d2e14b36d1063379b3454d7aace26f7061"}, "c211e7cf-ab76-46f0-83d1-903c0cba2c53": {"doc_hash": "58ea487a16851a2d8ff166678be68a6aa5eda946c0a73c1818dd4fc822ef2f8a"}, "080c3d72-2eac-4732-bbfa-bb174a853d20": {"doc_hash": "116b01fe36b139d6f209aac225d7745825754d5d69f1ecc15a248fe1de62625f"}, "4b4b0343-0672-4905-bc5c-aa64b80b2052": {"doc_hash": "e07ac345a42e7093caa833a8148d1eaab478c79f2415b8edc3db4d5316f98ae5"}, "a7b8fdb5-4556-4ef4-879f-cb1b5c6ef740": {"doc_hash": "3c49a1dec6251c6fbbff52476c3aa02f4df629f1ab20bb1b7844e4f5573be5c3"}, "64488cf0-301f-4740-b07b-bc2fb6a0c61d": {"doc_hash": "6b96fdb61ba7c363b725a53b6fd5e827dbb86d0ed1a7fe100a0d7374090b69c8"}, "4c41cb0b-84ea-4c50-9eb8-a25df47aedc2": {"doc_hash": "a889a62da5aa65e0a4cabeda0a2228fdceead0ce68eac6873b97af91828fc380"}, "025167d9-0702-45ef-b606-20ec0eb0a2cd": {"doc_hash": "b4e3cb8336c16f4c3a03960e58b9ed1ec7ea4d82353c5be142ee0b89b7a76b17"}, "3425d737-af39-45d0-8f04-2b72dde4a339": {"doc_hash": "774038a87c598bcf657a7c768bfac52f9feb04fca26a4c0981f2bd0737d0ba7a"}, "64353ddd-3c95-4692-bd21-5bb640eadba2": {"doc_hash": "2d7f2fda11fac868d92c3b66b6a951922eda6ba956ca47498cef633337568f43"}, "f7f429a3-ab04-4aa0-8e90-988065bb000b": {"doc_hash": "0fa551a3c21f5cbf8e8223abc43c4b9fd45cf991381d697439a3347c72ec5917"}, "e2da286f-7610-4983-824d-17f34d24f0db": {"doc_hash": "802af3a9e64157eda4994dff4ece4ae0857b3d66999fecc43340f08588eb7b94"}, "153dbd8b-c3db-440c-94ab-dc317da7c899": {"doc_hash": "60c43679017d7ba437e31b0dfb631bd04135f558d4c810bdac09e623b06d51d1"}, "dd431761-d388-47ef-a8e9-effeb06ba30a": {"doc_hash": 
"9b7defe239919bcaa267dfb289c75e18cc8ed833afce9846f71444fc688bd118"}, "bda71a77-2b0c-48e2-b369-fbfa08fd367e": {"doc_hash": "0b61fb730d59d0c78881d6ca75e2546b2301a5cd30d5ab7070d8ea4058bdd458"}, "4c86b6e3-3ff5-471b-b53c-d1cdd4429c1c": {"doc_hash": "0afbd54be9ed6754d9cd816482e8821db5e1d5760a42c5c3af6e2089f2c27b62"}, "32fc2986-e25c-49ec-bbc9-521b901c5732": {"doc_hash": "e4b03c2691351e8f6e55f7836b9c079e40ea6fd28666ba14eba0df5dfd9f9a66"}, "41c60b4d-8b61-45e3-a9ac-b76f19cc786b": {"doc_hash": "c20710fa8d416a92064d32b78d479afa18d1d173c98bfd207b96988265506a68"}, "4dee3614-132d-4f05-9e97-ff9c83e04950": {"doc_hash": "2d5bc696361a28033e96482c501fc278f64409e65d5c76544e4973730326437c"}, "cd998e2b-0c48-46ba-b8b0-226a27b673cb": {"doc_hash": "0fa9c3ea38cea318c0c26ee3aa40bd9cacea4e5a08172443af4b104852f1da44"}, "abdbd229-f0f6-44a8-8533-050fd3402e54": {"doc_hash": "5c83a5861e12d51b6a727b234bc279bc3a61e4a98b8d4088d86c7fbdc796a560"}, "cc6d2e9e-5ab4-472d-94cc-ddee579bd55e": {"doc_hash": "b218e3d1b8cc4ebb5415288b8763f022ceb7f8bf42c154bebbb553194a03a3ef"}, "c271f2ef-db22-4c9e-8a1d-7b286c5e1512": {"doc_hash": "4943e140d69943ebbfad241d7e96987ab71be4a25a8fd9ad2d26049d6aafdfa6"}, "7e79e502-e90a-49ef-a6a1-0c1e7d935378": {"doc_hash": "49b0d730343e281c55a358177be4f23e7ae4e27c23f2be50e46248bdd7721123"}}}