
Dataset Card for Duskfallcrew Art Style Dataset

Dataset for the Duskfallcrew art style (Duskfallcrew, aka Kieran Somerville). This is a self-collected, self-made dataset of the artist's own work.

This dataset card was generated from the standard raw dataset card template.

License Requirements

You have the right to distribute the LoRA weights you train on this dataset, but you do not own the dataset itself. You're more than welcome to use it for LoRA or checkpoint training on any Stable Diffusion, Stable Cascade, or PixArt models.

Please check the full Out-of-Scope Use section below for details on prohibited uses.

Largely, the only things Earth & Dusk ask are that you do not RESELL the dataset and that you do not create print-on-demand products with it.

We realize the art isn't that great, but it's our art, and we wanted to share it.

Dataset Details

Dataset Description

Comic style art by Duskfallcrew of Earth & Dusk

Uses

Combining this with multiple style LoRAs would be wonderful; just note that you don't own the dataset. A loading sketch follows below.
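
For anyone wiring the dataset into a training pipeline, here is a minimal sketch of loading the images with the Hugging Face datasets library. The folder path is a placeholder assumption (a local copy of the images), and this is only one way to feed the data into LoRA or checkpoint training, not an official recipe from the dataset authors.

```python
# A minimal sketch, assuming the images have been downloaded to a local folder.
# The directory name below is a placeholder, not the actual repository layout.
from datasets import load_dataset

# Load the images with the generic "imagefolder" builder.
dataset = load_dataset(
    "imagefolder",
    data_dir="./duskfallcrew-art-style",  # placeholder path to your local copy
    split="train",
)

# Inspect a sample before pointing a LoRA training script
# (e.g. the diffusers text-to-image LoRA example) at the same folder.
print(dataset)
print(dataset[0]["image"].size)
```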

Direct Use

[More Information Needed]

Out-of-Scope Use

Modified from: https://freedevproject.org/faipl-1.0-sd/

You may not use this dataset or any derived model for the following:

In any way that violates any applicable national, federal, state, local or international law or regulation;

For the purpose of exploiting, harming or attempting to exploit or harm minors in any way;

To generate or disseminate verifiably false information and/or content with the purpose of harming others;

To generate or disseminate personal identifiable information that can be used to harm an individual;

To defame, disparage or otherwise harass others;

For fully automated decision making that adversely impacts an individual’s legal rights or otherwise creates or modifies a binding, enforceable obligation;

For any use intended to or which has the effect of discriminating against or harming individuals or groups based on online or offline social behavior or known or predicted personal or personality characteristics;

To exploit any of the vulnerabilities of a specific group of persons based on their age, social, physical or mental characteristics, in order to materially distort the behavior of a person pertaining to that group in a manner that causes or is likely to cause that person or another person physical or psychological harm;

For any use intended to or which has the effect of discriminating against individuals or groups based on legally protected characteristics or categories;

To provide medical advice and medical results interpretation;

To generate or disseminate information for the purpose to be used for administration of justice, law enforcement, immigration or asylum processes, such as predicting an individual will commit fraud/crime commitment (e.g. by text profiling, drawing causal relationships between assertions made in documents, indiscriminate and arbitrarily-targeted use).

No Harm

You agree that no contributor’s conduct in the creation of this dataset has caused you any harm. As far as the law allows, you give up your right to pursue any kind of legal claim against any contributor for actions related to the creation of this dataset, even if those actions broke a previous agreement.

Additionally, you agree not to use this dataset for harmful purposes, as listed in Prohibited Uses. These restrictions do not apply to non-model parts of this software.

No Liability

As far as the law allows, this dataset comes as is, without any warranty or condition, and no contributor will be liable to anyone for any damages related to this dataset or this license, under any kind of legal claim.

Dataset Card Contact

For queries about copyright and licensing of the dataset: https://www.end-media.org
