Upload 15 files

- banking.txt +3 -0
- certifications.txt +31 -0
- deeplearning.txt +1 -0
- domain-banking.txt +5 -0
- education.txt +1 -0
- imple.txt +1 -0
- personal.txt +6 -0
- presales.txt +1 -0
- publications.txt +9 -0
- requirements-eng.txt +1 -0
- requirements.txt +10 -0
- resume_app_6_v3_final.py +342 -0
- summary.txt +3 -0
- testing.txt +1 -0
- work experience.txt +2 -0
banking.txt
ADDED
@@ -0,0 +1,3 @@
+work profile - Commercial Banking:
+
+I gained experience in various retail, credit, and operations desks within commercial banking. My retail banking experience encompasses branch banking operations such as deposit mobilization, clearing, back-office and front-office operations, customer management, and preparation of retail and profitability reports. As part of the wholesale corporate banking unit, I developed expertise in relationship banking, credit appraisal, and the preparation of credit proposals. I also have extensive experience in credit operations, including sanctioning, disbursal, documentation, charge creation, and handling legal aspects of corporate lending. I have handled various credit products, such as working capital facilities, export financing products, and letters of credit, and participated in EXIM Bank-related activities and branch treasury operations. My credit operations experience includes monitoring credit accounts, post-sanction follow-up, proposal reviews, RBI reporting, ensuring prudential norms for NPAs, coordinating internal and regulatory audits, and conducting quality inspections. As Head of Credit at the Madurai Branch of Global Trust Bank, I met with numerous corporate clients for credit mobilization, fostered long-term client relationships, sourced export finance opportunities from textile units around Madurai, and collaborated with the branch manager to generate additional business and achieve profitability and growth targets.
certifications.txt
ADDED
@@ -0,0 +1,31 @@
+certifications
+
+I hold various certifications in different areas, including:
+
+* **Generative AI:** Generative AI with Large Language Models (Coursera, March 2024 - Present)
+* **Data Science:** Practical Data Science With AWS Cloud (Coursera, November 2023 - Present)
+* **AI:** AI for Medicine (Coursera, October 2023 - Present)
+* **Natural Language Processing:** Natural Language Processing Specialization (Coursera, April 2023 - Present)
+* **TensorFlow:** TensorFlow: Advanced Techniques (Coursera, April 2023 - Present)
+* **Deep Learning:** Deep Learning Specialization (Coursera, September 2022 - Present)
+* **Machine Learning:** Machine Learning Specialization (Coursera, August 2022 - Present)
+* **Agile Project Management:** Agile Project Management (Coursera and LinkedIn, July 2021 - Present)
+* **Agile Scrum:** Agile Scrum Master (Simplilearn, July 2021 - Present)
+* **DevOps:** DevOps Certification Training (Simplilearn, July 2021 - Present)
+* **Blockchain:** Introduction to Supply Chain Finance & Blockchain Technology (New York Institute of Finance, July 2021 - Present)
+* **Google Cloud:** Google Cloud Certifications (Google Cloud, July 2021 - Present)
+* **Kubernetes:** Architecting with Google Kubernetes Engine: Foundations (Google Cloud Training, June 2021 - Present)
+* **Investment Operations:** IT in Investment Operations (Securities & Investment Institute, London, April 2009 - Present)
+* **Securities Operations:** Global Securities Operations (Securities & Investment Institute, London, March 2009 - Present)
+* **Six Sigma:** Six Sigma Green Belt Professional (IIPM, Chennai, November 2007 - Present)
+* **Java:** Advanced Java and Web Technologies (SSI, April 2003 - Present)
+* **Networking:** Network+ (Digiterati, Chennai, April 2003 - Present)
+* **Oracle:** Oracle 9i with Visual Basic (SSI, February 2003 - Present)
+* **Software Quality:** Software Quality Professional (CSQP) (STQC IT Services, ETDC, Chennai, December 2001 - Present)
+* **Cloud Architecture:** AWS Certified Solution Architect - Associate (AWS, November 2021 - November 2024)
+* **Debt Market:** FIMMDA-NSE Debt Market (Basic) Module (NSE, Mumbai, September 2007 - September 2012)
+* **Capital Market:** Capital Market (Dealers) Module (NSE, Mumbai, August 2007 - August 2012)
+* **Depository Operations:** NSDL Depository Operations (NSE, Mumbai, September 2007 - August 2012)
+* **Project Management:** PMP (PMI, June 2007 - June 2011)
+
+This comprehensive list of certifications demonstrates my commitment to continuous learning and staying current with the latest technologies and methodologies in my field.
deeplearning.txt
ADDED
@@ -0,0 +1 @@
+deep learning expertise - My expertise in data science and deep learning is demonstrated through a range of certifications from DeepLearning.ai and Coursera, including specializations in Generative AI with Large Language Models, Deep Learning, Machine Learning, TensorFlow Advanced Techniques, Natural Language Processing, and AI for Medicine. I am also a certified AWS Associate Solution Architect and certified in "Practical Data Science With AWS Cloud". I actively participate in Kaggle competitions, earning the titles of Notebook Expert and Discussions Expert. My Kaggle profile showcases my contributions: [https://www.kaggle.com/murugesann](https://www.kaggle.com/murugesann). Additionally, I contribute to the data science community on GitHub at [https://github.com/nmuru](https://github.com/nmuru). I frequently share insights and knowledge through articles on my LinkedIn profile: [https://www.linkedin.com/in/murugesan-n/](https://www.linkedin.com/in/murugesan-n/).
domain-banking.txt
ADDED
@@ -0,0 +1,5 @@
+Work Profile - Domain Expertise in Financial Applications:
+
+I possess in-depth knowledge of core banking products, with proficiency in leading solutions like Infosys' Finacle and Oracle's Flexcube. My experience extends to working with in-house banking applications at multinational banks like Standard Chartered. I have expert-level understanding of various modules, including retail banking, corporate banking, FX, money market, and derivatives, as well as their integration with external applications such as payment systems.
+
+Additionally, I am a domain expert in securities operations, possessing knowledge and certification in global securities and investment operations. This includes expertise in areas such as securities trade life cycle, trade & settlement matching, global custodial services, settlement & clearing operations, asset servicing, and other front, middle, and back-office operations.
education.txt
ADDED
@@ -0,0 +1 @@
+education - I completed my CFA (ICFAI) through distance learning at ICFAI (Tripura) University from October 2009 to July 2011, achieving a score of 79.08%. I earned my MBA in Finance & Systems from Bharathidasan Institute of Management (BIM), Trichy, with a GPA of 3.63/4.00, studying on-campus from June 1996 to May 1998. I hold a BE in Mechanical Engineering from College of Engineering, Guindy (CEG), Anna University, Chennai, which I completed on-campus from June 1990 to May 1994, graduating with 78.74%. I finished my Higher Secondary education at Bishop Heber Hr Sec School, Trichy, with 89.25% from June 1988 to May 1990. Prior to that, I completed my SSLC at GHSS, Lalgudi, with 88.60% from June 1987 to May 1988.
imple.txt
ADDED
@@ -0,0 +1 @@
+Work Profile - Implementation consulting - At Infosys' Mangalore DC, I reviewed the functional and requirement specifications for manufacturing modules, specifically the Bill of Material, for the Delmonte account. At Infosys' Bangalore DC, as part of the Sony Real Time Advanced Supply Chain and Demand Planning System Implementation and Maintenance Team, I was involved in functional consulting for production support, maintenance, enhancements, and defect fixing. I also interfaced with the design and onsite teams, monitored the complex multi-plant SCM application, and maintained and enhanced the application. Later, I transitioned to Infosys' Banking Business Unit, briefly working with the Data Migration Team for Oriental Bank of Commerce's Finacle Implementation in Delhi.
personal.txt
ADDED
@@ -0,0 +1,6 @@
+Personal
+Date of Birth / Age / Sex : 11.01.1973 / 51 yrs / M
+Marital Status : Single (Divorcee)
+Present Address : 89/3, Vasantha Nagar, Lalgudi-621601, Trichy, TN, India
+Contact Detail : Email: nmurugs@gmail.com, n_murugs@hotmail.com
+Mobile : 91-6374555193
presales.txt
ADDED
@@ -0,0 +1 @@
+Work Profile - Pre-sales: As a Pre-Sales Manager at Iflex Solutions (now Oracle), I responded to numerous RFIs/RFPs concerning Flexcube's retail and corporate modules for international clients. This involved configuring the product for client demos, facilitating pre-sales demos and product walkthroughs, performing gap analysis, and coordinating with implementation and delivery teams to close deals. I was also part of the implementation release testing team, testing Flexcube's Loans, Deposit, and Money Market modules. Major international pre-sales assignments included demos and product walkthroughs for Flexcube Corporate Suite at Bank of East Asia in London, Securities and Money Market modules at ICBC in Nigeria, and the Commercial Lending module for CTCB in Taiwan.
publications.txt
ADDED
@@ -0,0 +1,9 @@
+publications - I have published several academic papers, demonstrating my research and expertise in finance and project management. My publications include:
+
+* "When Equity Returns Are Different from Discount Rates: Why PEG Based DCF Valuation is Flawed" in the SSRN e-Journal.
+* "Valuation of Levered vs. Unlevered Firm: The Impact of Debt on Growth Opportunities – Why the APV Equation Could be Misleading" in the SSRN e-Journal: Corporate Finance: Capital Structure & Payout Policies eJournal.
+* "CAPM and DCF: Does SML Relationship Gives Cost of Equity? – Ex-Ante Stock Returns are Not Same as Implied Discount Rates" in the SSRN e-Journal: Econometric Modelling: Capital Markets - Asset Pricing eJournal.
+* "Reinvestment Assumptions Inherent in DCF Methodologies: The Concept of Reinvestment IRR" in the SSRN e-Journals.
+* "Validity of CAPM: Security Market Line (SML) Can Never Predict Required Rate of Return for Equity Even If the Markets are Efficient - A Simple Intuitive Explanation" in the Zenith International Journal of Multidisciplinary Research (2013) and the SSRN e-Journal: Capital Markets: Market Efficiency eJournal.
+* "Company Valuation: A Comparison of Three DCF Models (MM (1958), Fernandez (2007), Hamada (1972)) and Why MM's Corrected Equation for Cost of Equity is Wrong" in the SSRN e-Journal: Capital Markets: Asset Pricing & Valuation eJournal.
+* "Software Project Management Framework - Integrating software engineering and project management principles" in the Conference proceedings of Project Management Leadership Seminar – 2008 of QA.
requirements-eng.txt
ADDED
@@ -0,0 +1 @@
+Work Profile - Requirements Engineering / Functional Consulting / Business Analysis - At Ramco Systems, I led the requirements engineering for the Credit/Debit Note component of the Accounts Payable module in their ERP application, Ramco e.Apps. This involved preparing linear process chains, defining features, and developing detailed requirements specifications. I also designed the user interface for an internet-based architecture using Dreamweaver, mapped requirements using the Ramco Virtual Works framework, and developed test cases. Additionally, I coordinated with the design team to ensure seamless development and testing of the component. Furthermore, I served as a functional consultant for Ramco Systems' first external project – developing enterprise-level courier operation software for AFL Wiz Corporation. In this role, I analyzed and developed "as-is" and "to-be" linear process chains for the Order to Cash process, created a prototype using UML modeling, and collaborated with the customer and onsite team to gather requirements. I also completed the detailed requirements specification for the module, developed test cases, and coordinated with the design team for development and testing.
requirements.txt
ADDED
@@ -0,0 +1,10 @@
+langchain
+langchain-groq
+sentence-transformers
+langchainhub
+faiss-cpu
+gradio
+gradio_client
+crewai==0.28.8
+crewai_tools==0.1.6
+langchain_community==0.0.29
resume_app_6_v3_final.py
ADDED
@@ -0,0 +1,342 @@
+# -*- coding: utf-8 -*-
+"""Resume-App-6-v1-Final.ipynb
+
+Automatically generated by Colab.
+
+Original file is located at
+    https://colab.research.google.com/drive/17tlCu0ZFxFJHLoFrfIbkq7JjCdexUeu6
+"""
+
+# !pip install langchain langchain-groq sentence-transformers langchainhub faiss-cpu gradio gradio_client
+# !pip install crewai==0.28.8 crewai_tools==0.1.6 langchain_community==0.0.29
+
+import os
+import json
+
+from crewai import Agent, Task, Crew
+
+# langchain imports
+from langchain import hub
+from langchain.agents import AgentExecutor, Tool, create_react_agent, create_structured_chat_agent
+from langchain.chains import RetrievalQA, create_retrieval_chain
+from langchain.chains.combine_documents import create_stuff_documents_chain
+from langchain.schema.output_parser import StrOutputParser
+from langchain_core.output_parsers import JsonOutputParser
+from langchain_core.prompts import ChatPromptTemplate, PromptTemplate
+from langchain_groq import ChatGroq
+from langchain_community.tools import DuckDuckGoSearchResults, DuckDuckGoSearchRun
+from langchain_community.embeddings.sentence_transformer import SentenceTransformerEmbeddings
+from langchain_community.document_loaders import TextLoader
+
+# gradio
+import gradio as gr
+
+# vectorstore
+from langchain_community.vectorstores import FAISS
+
+embedding_function = SentenceTransformerEmbeddings(model_name="all-MiniLM-L6-v2")
+
+# Define the LLM - ChatGroq on the Groq platform with Llama3 70B
+llm = ChatGroq(
+    api_key=os.environ["GROQ_API_KEY"],  # keep the key in the environment, never hard-coded in source
+    model="llama3-70b-8192",
+    # model="gemma-7b-it",
+    # model="mixtral-8x7B-32768",
+    temperature=0,
+)
+
+file_names = ['banking.txt', 'certifications.txt', 'deeplearning.txt', 'domain-banking.txt',
+              'education.txt', 'imple.txt', 'personal.txt', 'presales.txt', 'publications.txt',
+              'summary.txt', 'requirements-eng.txt', 'testing.txt', 'work experience.txt']
+
+documents = []
+for filename in file_names:
+    if filename.endswith(".txt"):
+        loader = TextLoader(filename)
+        doc = loader.load()[0]  # load the single Document object for the file
+        documents.append(doc)
+
+vectorstore1 = FAISS.from_documents(documents, embedding_function)
+vectorstore1.save_local("vectorstore1")
+
+retriever1 = vectorstore1.as_retriever(search_type='mmr', search_kwargs={"k": 10})
+
+os.environ["OPENAI_API_KEY"] = os.environ["GROQ_API_KEY"]  # CrewAI expects this variable to be set
+
+# Agent 1: Interview candidate
+Interview_candidate = Agent(
+    llm=llm,
+    role="Interview candidate who gives final answers",
+    goal='''You are currently attending an interview. \
+The name of the company that is interviewing is {company}. The position for which the interview is conducted is {position}. \
+Your objective is to ace the interview and get the job based on your qualifications and expertise.''',
+    verbose=True,
+    memory=True,
+    backstory='''You are currently attending an interview. \
+For all the questions asked, you should NOT ONLY answer from the context provided \
+BUT ALSO from the conversational history available to you. You can also use any conversational memory history stored as embeddings as part of the crew. \
+Your answer should be confidently articulated, using a professional tone and style, and be concise and clear.''',
+)
+
+# Task for the candidate agent: answer the current interview question
+Interview_answer_task = Task(
+    description='''You are being interviewed by a company. The name of the company that is interviewing is {company}. The position for which the interview is conducted is {position}. \
+Note that you might have been asked questions before this one. \
+You should answer based on NOT ONLY the current information context provided \
+BUT ALSO on previous questions, answers and context that are part of the conversation. \
+The current question is {question}. The current context is {context}. You should use any memory information you have already stored based on conversational history.''',
+    expected_output='''You are currently attending an interview. For all the questions asked, you SHOULD answer only from the context provided - \
+that is, the data provided for this purpose. Your answer should be confidently articulated, using a professional tone and style, and be concise and clear.''',
+    agent=Interview_candidate,
+)
+
+interview_crew = Crew(
+    agents=[Interview_candidate],
+    tasks=[Interview_answer_task],
+    memory=True,
+    embedder={"provider": "gpt4all"},
+    verbose=True,
+)
+
+from langchain.chains import create_history_aware_retriever
+from langchain_community.chat_message_histories import ChatMessageHistory
+from langchain_core.chat_history import BaseChatMessageHistory
+from langchain_core.prompts import MessagesPlaceholder
+from langchain_core.runnables.history import RunnableWithMessageHistory
+
+### Contextualize question ###
+contextualize_q_system_prompt = (
+    "Given a chat history and the latest user question "
+    "which might reference context in the chat history, "
+    "formulate a standalone question which can be understood "
+    "without the chat history. Do NOT answer the question, "
+    "just reformulate it if needed and otherwise return it as is."
+)
+contextualize_q_prompt = ChatPromptTemplate.from_messages(
+    [
+        ("system", contextualize_q_system_prompt),
+        MessagesPlaceholder("chat_history"),
+        ("human", "{input}"),
+    ]
+)
+# MMR retriever used by the history-aware chain
+retriever2 = vectorstore1.as_retriever(search_type='mmr', search_kwargs={"k": 10})
+history_aware_retriever = create_history_aware_retriever(
+    llm, retriever2, contextualize_q_prompt
+)
+
+### Answer question ###
+system_prompt = (
+    "You are an assistant for question-answering tasks. "
+    "Use the following pieces of retrieved context to answer "
+    "the question. If you don't know the answer, say that you "
+    "don't know. Use three sentences maximum and keep the "
+    "answer concise."
+    "\n\n"
+    "{context}"
+)
+qa_prompt = ChatPromptTemplate.from_messages(
+    [
+        ("system", system_prompt),
+        MessagesPlaceholder("chat_history"),
+        ("human", "{input}"),
+    ]
+)
+question_answer_chain = create_stuff_documents_chain(llm, qa_prompt)
+
+rag_chain = create_retrieval_chain(history_aware_retriever, question_answer_chain)
+
+### Statefully manage chat history ###
+store = {}
+
+
+def get_session_history(session_id: str) -> BaseChatMessageHistory:
+    if session_id not in store:
+        store[session_id] = ChatMessageHistory()
+    return store[session_id]
+
+
+conversational_rag_chain = RunnableWithMessageHistory(
+    rag_chain,
+    get_session_history,
+    input_messages_key="input",
+    history_messages_key="chat_history",
+    output_messages_key="answer",
+)
+
+from langchain_core.messages import AIMessage, HumanMessage
+
+
+def get_conversation():
+    conv = []
+    for message in store["abc123"].messages:
+        prefix = "AI" if isinstance(message, AIMessage) else "User"
+        conv.append(f"{prefix}: {message.content}\n")
+    return "\n".join(conv)
+
+
+def answer(input1, input2, input3):
+    company = input1
+    position = input2
+    question = input3
+
+    context = conversational_rag_chain.invoke(
+        {"input": question},
+        config={"configurable": {"session_id": "abc123"}},  # constructs the "abc123" key in `store`
+    )["answer"]
+
+    result = interview_crew.kickoff(
+        inputs={"question": question, "context": context, "company": company, "position": position}
+    )
+    return result
+
+
+def reset_store():
+    global interview_crew  # rebuild the crew so its memory starts fresh
+    interview_crew = Crew(
+        agents=[Interview_candidate],
+        tasks=[Interview_answer_task],
+        memory=True,
+        embedder={"provider": "gpt4all"},
+        verbose=True,
+    )
+    if 'abc123' in store:
+        store['abc123'].clear()
+    return store
+
+
+with gr.Blocks() as demo:
+    reset_store()
+
+    gr.Markdown("""<h1 style='color: blue;'>Interview Chatbot for N Murugesan</h1>""")
+    gr.Markdown("""Powered by CrewAI, Gradio, Groq, Llama3, FAISS, Langchain""")
+    gr.Markdown(
+        """
+<h2 style='color: blue;'>
+This chatbot will answer interview questions on behalf of Murugesan Narayanaswamy!</h2>
+"""
+    )
+    gr.Image("photo-recent.jpg", width=250)
+    gr.Markdown("""<h2 style='color: blue;'>Ask any HR Round Interview Questions - Factual Answers based on Resume!</h2>""")
+
+    # Use a Column to structure the inputs and outputs
+    with gr.Column():
+        text_input1 = gr.Textbox(
+            label="Enter the Company Name!",
+            placeholder='Enter the company name e.g. Cognizant Technologies',
+            value='Cognizant Technologies',
+        )
+        text_input2 = gr.Textbox(
+            label="Enter the Position you are interviewing for!",
+            placeholder='Enter the position you are interviewing for e.g. Generative AI - Consultant',
+            value='Generative AI - Consultant',
+        )
+        text_input3 = gr.Textbox(
+            label="Enter your question here!",
+            placeholder='''Ask your question; e.g., Tell something about yourself; Your career path has been diverse; \
+Could you walk us through the key transitions and the motivations behind those changes?''',
+            value='Have you participated in any kaggle competitions?',
+        )
+        button1 = gr.Button("Answer!")
+        outputs1 = [
+            gr.Textbox(label="My Answer - you can verify with resume later!", show_copy_button=True)
+        ]
+
+        button1.click(answer, inputs=[text_input1, text_input2, text_input3], outputs=outputs1)
+
+    gr.Markdown("""<h2 style='color: blue;'>Get Interview Transcript!</h2>""")
+    outputs2 = [
+        gr.Textbox(label="Interview Transcript!", show_copy_button=True)
+    ]
+    button2 = gr.Button("Get Interview Transcript!")
+    button2.click(get_conversation, outputs=outputs2)
+
+    gr.Markdown("""<h2 style='color: blue;'>Reset the Interview!</h2>""")
+    button3 = gr.Button("Reset Interview!")
+    button3.click(reset_store)
+
+# Launch the Gradio app
+demo.launch()
+demo.close()
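The per-session history handling in resume_app_6_v3_final.py (the `store` dict, `get_session_history`, and the clearing logic in `reset_store`) can be sketched in plain Python. This is a simplified illustration, not the app's actual code: a plain list stands in for LangChain's `ChatMessageHistory`, so the keying and reset behaviour can be seen in isolation.

```python
# Minimal sketch of the session-history store pattern used in the app.
# A plain list stands in for ChatMessageHistory (assumption for illustration).
store = {}

def get_session_history(session_id):
    # Create an empty history the first time a session id is seen.
    if session_id not in store:
        store[session_id] = []
    return store[session_id]

def reset_store(session_id="abc123"):
    # Clearing (rather than deleting) keeps existing references to the
    # history object valid, mirroring store['abc123'].clear() in the app.
    if session_id in store:
        store[session_id].clear()
    return store

get_session_history("abc123").append(("User", "Tell me about yourself"))
get_session_history("abc123").append(("AI", "I am an Engineer-MBA-CFA..."))
print(len(store["abc123"]))  # 2
reset_store()
print(len(store["abc123"]))  # 0
```

Because every caller goes through `get_session_history`, the same mutable history object is shared across turns of one session while separate sessions stay isolated.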
summary.txt
ADDED
@@ -0,0 +1,3 @@
+Resume Summary: I am an Engineer-MBA-CFA with over 15 years of experience in leading companies across engineering, banking, IT, and education. I consistently ranked within the top 5 in reputable educational institutions and have worked at companies like L&T, Global Trust Bank, Ramco Systems, Infosys, Wipro Technologies, and VIT Business School. After my MBA, I spent 2.5 years in commercial banking, rising to Head of Credit at the Madurai Branch. I then transitioned to the IT industry, spending 7+ years in techno-functional roles, working with ERP systems and core banking products. I also obtained PMP and Global Securities Operations certifications during this time. My IT experience includes requirements engineering, functional consulting, business analysis, implementation consulting, presales, and test management. As a presales manager, I responded to RFIs/RFPs, facilitated demos, and worked with teams in London, Nigeria, and Taiwan. I possess strong technical skills in areas like enterprise application architecture, UI design, Java/J2EE, Python, AWS Cloud Technologies, GCP, Kubernetes, CI/CD DevOps, Agile Project Management/Scrum, Machine Learning, Deep Learning, Natural Language Processing, Generative AI, and Langchain Agents.
+
+In 2008, I took a spiritual sabbatical, during which I completed my CFA, cleared the UGC-NET exam, and became an Assistant Professor at a b-school. I also worked as a freelance academic content developer. Currently, I am seeking opportunities in the IT industry, specifically in data science, deep learning, and AI. I have upgraded my skills in these areas and hold an AWS Associate Solution Architect certification. I have also completed various Coursera certifications in machine learning, deep learning, NLP, generative AI, Agile project management, Google Cloud Professional, Google Kubernetes, and blockchain. My unique blend of academic excellence, diverse work experience, spiritual growth, and cutting-edge IT skills makes me a suitable candidate for challenging roles in deep learning and AI. I am eager to contribute my experience and knowledge to my next role in the IT industry.
testing.txt
ADDED
@@ -0,0 +1 @@
+Work Profile - Test Management: As Test Manager at the Central Test Facility of Scope International, I was responsible for managing the testing of Standard Chartered Bank's enterprise-level financial applications. This involved test management and testing activities for various projects. For the SSD Data mart Phase 3 project, I oversaw System Integration Testing of 24 banking application systems across 11 countries, including enhancements to the Sales Data mart for analytics purposes. For the Euronet SIT project, I managed System Integration Testing for an ATM Switch Migration from CR2 to Euronet, which involved the Hogan core banking system. Additionally, I performed functional testing of Marcis, a retail lending product, and participated in testing SCB's International Multi-currency payment platform. At Wipro, I served as a Project Manager in Testing Services, where I was involved in State Street–TA EURO UAT regression testing for iFAST Mutual Fund Trade Processing and Washington Mutual functional & system integration testing for their Retail Case Management System. During my time at Wipro, I streamlined the test management process and introduced a Test Run Chart Framework for Trade Life Cycle Testing.
work experience.txt
ADDED
@@ -0,0 +1,2 @@
+Work Experience - From May 1994 to May 1996, I worked as a GET/Product Engineer at Larsen & Toubro Ltd. in Mumbai. From June 1999 to December 2000, I worked as a Manager & Head of Credit at Global Trust Bank Ltd. in Madurai. From December 2000 to March 2003, I worked as a Requirement Engineer/Production Consultant at Ramco Systems Ltd. in Chennai. From April 2003 to June 2004, I worked as an Associate Consultant at Infosys in Mangalore. From July 2004 to December 2004, I worked as an Implementation Consultant at Infosys in Bengaluru. From January 2005 to April 2006, I worked as a Manager Presales at Iflex Solutions in Bengaluru. From May 2006 to March 2007, I worked as a Test Manager at PSI Data Systems (Scope International) in Chennai. From April 2007 to June 2007, I worked as a Project Manager at D&B Predictive & Analytic Sciences Pvt Ltd. in Chennai. From July 2007 to June 2008, I worked as a Project Manager at Wipro Technologies Ltd. in Chennai. From December 2011 to March 2012, I worked as an Assistant Professor (Finance) at VIT Business School in Vellore. From May 2012 to the present, I have been engaged in Self-directed Learning, Spiritual Exploration, and Academic Content Development.
+