Fix error
cache_system.py  (+7 -0)  CHANGED
@@ -10,6 +10,13 @@ class CacheHandler:
         self.cache["https://ikergarcia1996.github.io/Iker-Garcia-Ferrero/"] = {
             "title": "Iker García-Ferrero | Personal Webpage",
             "date": datetime.now(),
+            "text": """I am currently a PhD candidate specializing in Natural Language Processing (NLP) at the University of the Basque Country UPV/EHU, IXA Group, and HiTZ Basque Center for Language Technologies, funded by a grant from the Basque Government. My advisors are German Rigau and Rodrigo Agerri. I anticipate concluding my PhD by early 2024.
+
+My previous experiences include an internship as an Applied Scientist at Amazon Barcelona, where I was part of Lluis Marquez's team. I also served as Visiting Associate for 4 months at the School of Engineering and Applied Science, Department of Computer and Information Science, Cognitive Computation Group at the University of Pennsylvania under the supervision of Dan Roth.
+
+My research primarily focuses on Multilingual Natural Language Processing. I aim to develop deep learning models and resources that enable NLP in languages with limited or no available resources. This research branches in two directions. First, data-transfer methods for which I have developed state-of-the-art techniques to automatically generate annotated data for languages that lack these resources. Second, model-transfer methods, a field in which I've made significant contributions to improve the zero-shot cross-lingual performance of NLP models. Recently, my research has branched into training Large Language Models (LLMs) for various tasks and domains. The most notable ones being GoLLIE a 34B parameter LLM which achieves state-of-the-art results for zero-shot Information Extraction, and MedMT5, the first open-source text-to-text multilingual model for the medical domain.
+
+""",
             "summary_0": "Iker García-Ferrero es un candidato a PhD en Natural Language Processing (NLP) "
             "en la Universidad del País Vasco UPV/EHU, IXA Group y HiTZ Centro Vasco de Tecnología de la "
             "Lengua, financiado por una beca del Gobierno Vasco. "
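For context, the hunk above only shows a pre-seeded entry inside CacheHandler's cache dict; the rest of cache_system.py is not part of this commit. Below is a minimal sketch of how such an entry might be consulted, assuming a simple get-or-fetch pattern. The get_or_fetch method, the fetch_fn callable, and the max_age parameter are illustrative assumptions and do not appear in the diff; only the class name and the entry layout ("title", "date", "text", "summary_0") come from the hunk.

    # Hypothetical sketch of CacheHandler usage around the pre-seeded entry.
    from datetime import datetime, timedelta


    class CacheHandler:
        def __init__(self):
            # URL -> {"title": ..., "date": ..., "text": ..., "summary_0": ...}
            self.cache = {}

        def get_or_fetch(self, url, fetch_fn, max_age=timedelta(hours=24)):
            """Return the cached entry for `url` if it is still fresh,
            otherwise call `fetch_fn(url)` and store its result."""
            entry = self.cache.get(url)
            if entry is not None and datetime.now() - entry["date"] < max_age:
                return entry
            entry = fetch_fn(url)  # expected to return the same dict layout
            entry["date"] = datetime.now()
            self.cache[url] = entry
            return entry


    # Example usage with a placeholder fetcher:
    # handler = CacheHandler()
    # page = handler.get_or_fetch(
    #     "https://ikergarcia1996.github.io/Iker-Garcia-Ferrero/",
    #     fetch_fn=lambda u: {"title": "...", "text": "..."},
    # )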