Dataset Card for ArSenTD-LEV

Dataset Summary

The Arabic Sentiment Twitter Dataset for Levantine dialect (ArSenTD-LEV) contains 4,000 tweets written in Arabic, retrieved in equal proportions from Jordan, Lebanon, Palestine, and Syria.
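
A minimal sketch of loading the dataset with the datasets library. The dataset id "arsentd_lev" and the availability of the source archive referenced by the loading script are assumptions, not guarantees:

from datasets import load_dataset

# Hypothetical usage; "arsentd_lev" is the assumed Hub id, and the original
# archive hosted on oma-project.com is assumed to be reachable.
ds = load_dataset("arsentd_lev", split="train")

print(ds)     # number of rows and column names
print(ds[0])  # first example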

Supported Tasks and Leaderboards

Sentiment analysis

Languages

Arabic, Levantine dialect

Dataset Structure

Data Instances

{'Country': 0, 'Sentiment': 3, 'Sentiment_Expression': 0, 'Sentiment_Target': 'هاي سوالف عصابات ارهابية', 'Topic': 'politics', 'Tweet': 'ثلاث تفجيرات في #كركوك الحصيلة قتيل و 16 جريح بدأت اكلاوات كركوك كانت امان قبل دخول القوات العراقية ، هاي سوالف عصابات ارهابية'}
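
In the instance above, Country, Sentiment, and Sentiment_Expression appear as integer class ids. Assuming these are ClassLabel features (a guess based on the integer values shown), they can be decoded back to their string names; this is only a sketch:

from datasets import load_dataset

ds = load_dataset("arsentd_lev", split="train")  # assumed dataset id
example = ds[0]

# int2str maps an integer class id back to its label name; which fields are
# ClassLabel features is an assumption based on the instance shown above.
print(ds.features["Country"].int2str(example["Country"]))
print(ds.features["Sentiment"].int2str(example["Sentiment"]))
print(ds.features["Sentiment_Expression"].int2str(example["Sentiment_Expression"]))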

Data Fields

Tweet: the text content of the tweet
Country: the country from which the tweet was collected ('jordan', 'lebanon', 'syria', 'palestine')
Topic: the topic being discussed in the tweet (personal, politics, religion, sports, entertainment and others)
Sentiment: the overall sentiment expressed in the tweet (very_negative, negative, neutral, positive and very_positive)
Sentiment_Expression: how the sentiment was expressed: explicit, implicit, or none (the latter when the sentiment is neutral)
Sentiment_Target: the segment of the tweet toward which the sentiment is expressed. If the sentiment is neutral, this field takes the value 'none'.

Data Splits

No standard splits are provided; the sketch below shows one way to create a local split.
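
Since no official splits ship with the dataset, a reproducible held-out split can be created locally with train_test_split. This is only a sketch under the same assumed dataset id as above, not part of the dataset itself:

from datasets import load_dataset

ds = load_dataset("arsentd_lev", split="train")  # assumed dataset id

# Create an 80/20 train/test split with a fixed seed for reproducibility.
splits = ds.train_test_split(test_size=0.2, seed=42)
train_ds, test_ds = splits["train"], splits["test"]
print(len(train_ds), len(test_ds))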

Dataset Creation

Curation Rationale

[More Information Needed]

Source Data

Initial Data Collection and Normalization

[More Information Needed]

Who are the source language producers?

[More Information Needed]

Annotations

Annotation process

[More Information Needed]

Who are the annotators?

[More Information Needed]

Personal and Sensitive Information

[More Information Needed]

Considerations for Using the Data

Social Impact of Dataset

[More Information Needed]

Discussion of Biases

[More Information Needed]

Other Known Limitations

[More Information Needed]

Additional Information

Dataset Curators

[More Information Needed]

Licensing Information

Make sure to read and agree to the dataset's license before using the data.

Citation Information

@article{baly2019arsentd,
  title={{ArSenTD-LEV}: A Multi-Topic Corpus for Target-Based Sentiment Analysis in {A}rabic {L}evantine Tweets},
  author={Baly, Ramy and Khaddaj, Alaa and Hajj, Hazem and El-Hajj, Wassim and Shaban, Khaled Bashir},
  journal={arXiv preprint arXiv:1906.01830},
  year={2019}
}

Contributions

Thanks to @moussaKam for adding this dataset.
