Different output when deploying on SageMaker

#3
by maxreis86 - opened

Hi guys,

This model is amazing. Congratulations and thanks for sharing \o/

I have deployed the model with the "Use in Transformers" option using the code below, and I got the same output as presented in the model card:

from transformers import AutoTokenizer, AutoModelForTokenClassification
tokenizer = AutoTokenizer.from_pretrained("Jean-Baptiste/roberta-large-ner-english")
model = AutoModelForTokenClassification.from_pretrained("Jean-Baptiste/roberta-large-ner-english")

##### Process text sample (from Wikipedia)

from transformers import pipeline
nlp = pipeline('ner', model=model, tokenizer=tokenizer, aggregation_strategy="simple")
nlp("Apple was founded in 1976 by Steve Jobs, Steve Wozniak and Ronald Wayne to develop and sell Wozniak's Apple I personal computer")

[{'entity_group': 'ORG',
  'score': 0.99381506,
  'word': ' Apple',
  'start': 0,
  'end': 5},
 {'entity_group': 'PER',
  'score': 0.99970853,
  'word': ' Steve Jobs',
  'start': 29,
  'end': 39},
 {'entity_group': 'PER',
  'score': 0.99981767,
  'word': ' Steve Wozniak',
  'start': 41,
  'end': 54},
 {'entity_group': 'PER',
  'score': 0.99956465,
  'word': ' Ronald Wayne',
  'start': 59,
  'end': 71},
 {'entity_group': 'PER',
  'score': 0.9997918,
  'word': ' Wozniak',
  'start': 92,
  'end': 99},
 {'entity_group': 'MISC',
  'score': 0.99956393,
  'word': ' Apple I',
  'start': 102,
  'end': 109}]

However, when deploying with the Amazon SageMaker option, I get a different output, as shown below:

from sagemaker.huggingface import HuggingFaceModel
import sagemaker

role = sagemaker.get_execution_role()
# Hub Model configuration. https://huggingface.co/models
hub = {
    'HF_MODEL_ID':'Jean-Baptiste/roberta-large-ner-english',
    'HF_TASK':'token-classification'
}

# create Hugging Face Model Class
huggingface_model = HuggingFaceModel(
    transformers_version='4.17.0',
    pytorch_version='1.10.2',
    py_version='py38',
    env=hub,
    role=role, 
)

# deploy model to SageMaker Inference
predictor = huggingface_model.deploy(
    initial_instance_count=1, # number of instances
    instance_type='ml.m5.xlarge' # ec2 instance type
)

predictor.predict({
    'inputs': "Apple was founded in 1976 by Steve Jobs, Steve Wozniak and Ronald Wayne to develop and sell Wozniak's Apple I personal computer"
})

[{'entity': 'ORG',
  'score': 0.9938150644302368,
  'index': 1,
  'word': 'ĠApple',
  'start': 0,
  'end': 5},
 {'entity': 'PER',
  'score': 0.9996582269668579,
  'index': 7,
  'word': 'ĠSteve',
  'start': 29,
  'end': 34},
 {'entity': 'PER',
  'score': 0.999758780002594,
  'index': 8,
  'word': 'ĠJobs',
  'start': 35,
  'end': 39},
 {'entity': 'PER',
  'score': 0.9997087121009827,
  'index': 10,
  'word': 'ĠSteve',
  'start': 41,
  'end': 46},
 {'entity': 'PER',
  'score': 0.9998393058776855,
  'index': 11,
  'word': 'ĠW',
  'start': 47,
  'end': 48},
 {'entity': 'PER',
  'score': 0.9998897314071655,
  'index': 12,
  'word': 'oz',
  'start': 48,
  'end': 50},
 {'entity': 'PER',
  'score': 0.9998410940170288,
  'index': 13,
  'word': 'ni',
  'start': 50,
  'end': 52},
 {'entity': 'PER',
  'score': 0.9998093247413635,
  'index': 14,
  'word': 'ak',
  'start': 52,
  'end': 54},
 {'entity': 'PER',
  'score': 0.9995868802070618,
  'index': 16,
  'word': 'ĠRonald',
  'start': 59,
  'end': 65},
 {'entity': 'PER',
  'score': 0.9995424747467041,
  'index': 17,
  'word': 'ĠWayne',
  'start': 66,
  'end': 71},
 {'entity': 'PER',
  'score': 0.9997656941413879,
  'index': 22,
  'word': 'ĠW',
  'start': 92,
  'end': 93},
 {'entity': 'PER',
  'score': 0.999869704246521,
  'index': 23,
  'word': 'oz',
  'start': 93,
  'end': 95},
 {'entity': 'PER',
  'score': 0.9997791647911072,
  'index': 24,
  'word': 'ni',
  'start': 95,
  'end': 97},
 {'entity': 'PER',
  'score': 0.9997527003288269,
  'index': 25,
  'word': 'ak',
  'start': 97,
  'end': 99},
 {'entity': 'MISC',
  'score': 0.9996131062507629,
  'index': 27,
  'word': 'ĠApple',
  'start': 102,
  'end': 107},
 {'entity': 'MISC',
  'score': 0.9995146989822388,
  'index': 28,
  'word': 'ĠI',
  'start': 108,
  'end': 109}]

What do I need to change on SageMaker to get the same result, please?
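
One clue: the endpoint returns raw per-token predictions (e.g. 'ĠW', 'oz', 'ni', 'ak'), while the local pipeline groups them into entities because of aggregation_strategy="simple". If the Hugging Face inference toolkit forwards extra request parameters to the pipeline call, passing the strategy in the request body might reproduce the grouped output. A sketch of that idea (the 'parameters' key is an assumption on my part, not something I have verified):

# Untested sketch: assumes the inference toolkit passes 'parameters'
# through as keyword arguments to the underlying pipeline call
predictor.predict({
    'inputs': "Apple was founded in 1976 by Steve Jobs, Steve Wozniak and Ronald Wayne to develop and sell Wozniak's Apple I personal computer",
    'parameters': {'aggregation_strategy': 'simple'}
})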

Hello Maxuel,

I am the one who trained this model. Sorry, but I can't help you; I have never used SageMaker myself.
I did a quick search and saw someone else who fixed this issue by updating the transformers library to a more recent version. Maybe you can try that?
Otherwise, you could try asking your question on the Hugging Face forum; I saw a dedicated category for SageMaker.

Good luck!
Jean-Baptiste

Checking to see if anyone was able to find a solution for this. I am running into the same issue and would love to get some recommendations.

Hello DeepakSparks,
Since I was able to deploy it using AWS Lambda and it is working perfectly, I haven't delved into the SageMaker issue, so I don't know why this is happening.
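
In case it helps, here is a rough sketch of what I mean by the Lambda approach: a handler that runs the pipeline with aggregation_strategy="simple" (the handler name and event shape are illustrative, not my exact code; a model this size generally needs a container-image Lambda with enough memory):

# lambda_function.py - illustrative sketch
from transformers import pipeline

# Load once at module level so warm invocations reuse the model
nlp = pipeline(
    'ner',
    model='Jean-Baptiste/roberta-large-ner-english',
    aggregation_strategy='simple',
)

def handler(event, context):
    # Expects an event like {"inputs": "<text to tag>"}
    entities = nlp(event['inputs'])
    # Pipeline scores are numpy float32, which json can't serialize
    return [{**e, 'score': float(e['score'])} for e in entities]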

Thanks, let me check this out.
