Dataset schema (one row per candidate file for a pull request):

| column | dtype | stats |
|---|---|---|
| `repo_name` | string | 28 distinct values |
| `pr_number` | int64 | 1.86k to 122k |
| `pr_title` | string | length 5 to 204 |
| `author` | string | length 3 to 58 |
| `git_commit_prev` | string | length 40 |
| `git_commit_curr` | string | length 40 |
| `date_created` | string | length 25 |
| `date_merged` | string | length 25 |
| `query` | string | length 12 to 65.6k |
| `context_file_path` | string | length 6 to 233 |
| `label` | int64 | -1 or 1 |
| `language` | string | 5 distinct values |
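Given this schema, here is a minimal sketch (plain Python, with a hypothetical three-row sample taken from the rows below) of how the `label` column can be used to separate rows. The reading that `1` marks a file relevant to the PR and `-1` marks a distractor is an assumption based on the rows shown, not documented semantics:

```python
# Hypothetical in-memory sample of three rows; the column names match the
# dataset schema above. The label semantics (1 = relevant file, -1 =
# distractor) are an assumption inferred from the rows shown.
rows = [
    {"context_file_path": "./keras/backend/torch/nn.py", "label": 1},
    {"context_file_path": "./keras/backend/jax/nn.py", "label": 1},
    {"context_file_path": "./keras/callbacks/history.py", "label": -1},
]

# Keep only the positively labeled files.
positives = [r["context_file_path"] for r in rows if r["label"] == 1]
print(positives)
```

The same filter scales to the full dataset with `Dataset.filter` if the data is loaded through the `datasets` library instead of plain lists.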
Every row shown comes from the same pull request; the shared column values are:

- `repo_name`: keras-team/keras
- `pr_number`: 18876
- `pr_title`: Ensure dtype consistency for activation functions
- `author`: james77777778
- `git_commit_prev`: ec49bc1be737cd4170093f44fb7b76251a3a35dd
- `git_commit_curr`: 92f4d17d9fa1553504dc132d3e8a2fb6d0077551
- `date_created`: 2023-12-03 13:03:58+00:00
- `date_merged`: 2023-12-04 17:51:37+00:00
- `language`: python
- `query`:

  > Ensure dtype consistency for activation functions. This PR includes the following:
  >
  > 1. Replace `hard_swish` with `hard_silu` to follow the naming convention of `silu`
  > 2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
  > 3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
  > 4. Add dtype tests for all existing activation functions

The rows differ only in `context_file_path` and `label`:

| context_file_path | label |
|---|---|
| ./keras/backend/torch/nn.py | 1 |
| ./keras/ops/nn_test.py | 1 |
| ./keras/backend/numpy/nn.py | 1 |
| ./keras/backend/tensorflow/nn.py | 1 |
| ./keras/backend/jax/nn.py | 1 |
| ./keras/callbacks/lambda_callback_test.py | -1 |
| ./keras/models/variable_mapping_test.py | -1 |
| ./keras/metrics/iou_metrics.py | -1 |
| ./keras/layers/rnn/__init__.py | -1 |
| ./examples/keras_io/tensorflow/vision/image_captioning.py | -1 |
| ./keras/backend/tensorflow/core.py | -1 |
| ./keras/models/cloning_test.py | -1 |
| ./benchmarks/layer_benchmark/__init__.py | -1 |
| ./examples/keras_io/tensorflow/audio/speaker_recognition_using_cnn.py | -1 |
| ./keras/layers/preprocessing/random_zoom.py | -1 |
| ./keras/ops/operation_test.py | -1 |
| ./examples/demo_jax_distributed.py | -1 |
| ./keras/callbacks/history.py | -1 |
| ./keras/utils/audio_dataset_utils.py | -1 |
| ./examples/keras_io/tensorflow/generative/lstm_character_level_text_generation.py | -1 |
| ./keras/layers/activations/activation.py | -1 |
| ./keras/layers/preprocessing/rescaling.py | -1 |
| ./examples/keras_io/tensorflow/vision/integrated_gradients.py | -1 |
| ./keras/initializers/random_initializers_test.py | -1 |
| ./keras/optimizers/base_optimizer.py | -1 |
| ./keras/callbacks/early_stopping_test.py | -1 |
| ./keras/utils/text_dataset_utils_test.py | -1 |
| ./keras/legacy/saving/saving_utils.py | -1 |
| ./integration_tests/import_test.py | -1 |
| ./examples/keras_io/tensorflow/structured_data/structured_data_classification_from_scratch.py | -1 |
| ./keras/saving/saving_api.py | -1 |
| ./keras/backend/torch/optimizers/torch_adadelta.py | -1 |
| ./keras/utils/file_utils.py | -1 |
| ./keras/layers/convolutional/separable_conv_test.py | -1 |
| ./keras/layers/reshaping/permute_test.py | -1 |
| ./keras/applications/xception.py | -1 |
| ./keras/layers/normalization/layer_normalization.py | -1 |
| ./keras/layers/attention/additive_attention_test.py | -1 |
| ./keras/datasets/cifar100.py | -1 |
| ./examples/keras_io/nlp/neural_machine_translation_with_keras_nlp.py | -1 |
| ./examples/keras_io/tensorflow/vision/mobilevit.py | -1 |
| ./keras/layers/reshaping/up_sampling2d.py | -1 |
| ./keras/testing/test_utils_test.py | -1 |
| ./examples/keras_io/tensorflow/vision/shiftvit.py | -1 |
| ./examples/keras_io/tensorflow/vision/perceiver_image_classification.py | -1 |
| ./keras/layers/normalization/batch_normalization.py | -1 |
| ./keras/layers/rnn/conv_lstm2d_test.py | -1 |
| ./keras/layers/pooling/global_max_pooling1d.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./guides/functional_api.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./keras/trainers/data_adapters/__init__.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./keras/layers/activations/leaky_relu_test.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./examples/keras_io/vision/knowledge_distillation.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./keras/layers/normalization/__init__.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./keras/layers/convolutional/depthwise_conv2d.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./keras/layers/layer.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./benchmarks/model_benchmark/__init__.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./examples/demo_custom_jax_workflow.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./examples/keras_io/vision/object_detection_using_vision_transformer.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./keras/layers/regularization/spatial_dropout.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./examples/keras_io/tensorflow/keras_recipes/trainer_pattern.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./keras/layers/reshaping/cropping2d_test.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./keras/layers/preprocessing/hashed_crossing_test.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./benchmarks/torch_ctl_benchmark/conv_model_benchmark.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./keras/utils/code_stats_test.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./examples/keras_io/tensorflow/timeseries/timeseries_traffic_forecasting.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./guides/writing_your_own_callbacks.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./keras/backend/tensorflow/tensorboard.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./examples/keras_io/tensorflow/nlp/ner_transformers.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./keras/ops/core_test.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./keras/layers/reshaping/up_sampling3d.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./benchmarks/torch_ctl_benchmark/dense_model_benchmark.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./keras/layers/merging/concatenate.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./keras/backend/common/compute_output_spec_test.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./examples/keras_io/vision/mnist_convnet.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./keras/legacy/saving/json_utils_test.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./keras/backend/jax/layer.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./keras/backend/torch/__init__.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./shell/format.sh | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./examples/keras_io/tensorflow/keras_recipes/endpoint_layer_pattern.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./examples/keras_io/vision/mlp_image_classification.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./keras/layers/normalization/spectral_normalization.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./keras/utils/dataset_utils_test.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./keras/trainers/trainer_test.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./kokoro/github/ubuntu/gpu/jax/continuous.cfg | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./keras/backend/jax/trainer.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./keras/layers/preprocessing/integer_lookup_test.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./setup.cfg | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./keras/layers/merging/merging_test.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./keras/layers/merging/maximum.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./keras/layers/convolutional/conv_transpose_test.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of silu
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./examples/keras_io/tensorflow/vision/swim_transformers.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of `silu`
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./benchmarks/torch_ctl_benchmark/benchmark_utils.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of `silu`
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./examples/keras_io/tensorflow/vision/semantic_image_clustering.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of `silu`
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./examples/keras_io/nlp/neural_machine_translation_with_transformer.py | -1 | python |
keras-team/keras | 18,876 | Ensure dtype consistency for activation functions | james77777778 | ec49bc1be737cd4170093f44fb7b76251a3a35dd | 92f4d17d9fa1553504dc132d3e8a2fb6d0077551 | 2023-12-03 13:03:58+00:00 | 2023-12-04 17:51:37+00:00 | Ensure dtype consistency for activation functions. This PR includes the following:
1. Replace `hard_swish` with `hard_silu` to follow the naming convention of `silu`
2. Export `hard_silu` as `hard_swish`, following the pattern of `silu` and `swish`
3. ~Add `tanh` activation (it is currently missing in the codebase)~ (covered by `ops.numpy.tanh`)
4. Add dtype tests for all existing activation functions
| ./keras/losses/loss_test.py | -1 | python |
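The `hard_silu`/`hard_swish` rename described in the rows above is purely a naming change: both names denote the same piecewise-linear approximation of SiLU. A minimal NumPy sketch of the standard definition (an illustration of the formula, not the Keras implementation):

```python
import numpy as np

def hard_sigmoid(x):
    # Piecewise-linear approximation of sigmoid: relu6(x + 3) / 6.
    return np.clip((x + 3.0) / 6.0, 0.0, 1.0)

def hard_silu(x):
    # hard_silu(x) = x * hard_sigmoid(x); per the PR description it is
    # exported under both names, `hard_silu` and `hard_swish`.
    return x * hard_sigmoid(x)

hard_swish = hard_silu  # alias, mirroring the silu/swish pattern

x = np.array([-4.0, -1.0, 0.0, 1.0, 4.0], dtype=np.float32)
print(hard_silu(x))  # zero for x <= -3, identity for x >= 3
```

Because the two names are aliases, calling either function on the same input yields identical results.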
keras-team/keras | 18,860 | Add `hard_swish` to `ops.nn` and dtype tests for `hard_swish` and `hard_sigmoid` | james77777778 | 4d362cea4236637746d373623b89be37dce9660e | f0b7062e4c6a62c521af491b09d97f009b1add0b | 2023-12-01 02:02:15+00:00 | 2023-12-01 05:07:34+00:00 | Add `hard_swish` to `ops.nn` and dtype tests for `hard_swish` and `hard_sigmoid`. This PR:
- add `hard_swish` to `ops.nn` to ensure consistency for all activation functions in the codebase
- add dtype inference tests for `hard_sigmoid` and `hard_swish`
I plan to add more dtype inference tests for `ops.nn`. I think we can focus on float types for these ops, as it's uncommon to use them with integer types.
Is it a good idea (adding more tests / skipping integer types for `ops.nn`)? | ./keras/ops/nn.py | 1 | python |
keras-team/keras | 18,860 | Add `hard_swish` to `ops.nn` and dtype tests for `hard_swish` and `hard_sigmoid` | james77777778 | 4d362cea4236637746d373623b89be37dce9660e | f0b7062e4c6a62c521af491b09d97f009b1add0b | 2023-12-01 02:02:15+00:00 | 2023-12-01 05:07:34+00:00 | Add `hard_swish` to `ops.nn` and dtype tests for `hard_swish` and `hard_sigmoid`. This PR:
- add `hard_swish` to `ops.nn` to ensure consistency for all activation functions in the codebase
- add dtype inference tests for `hard_sigmoid` and `hard_swish`
I plan to add more dtype inference tests for `ops.nn`. I think we can focus on float types for these ops, as it's uncommon to use them with integer types.
Is it a good idea (adding more tests / skipping integer types for `ops.nn`)? | ./keras/activations/activations.py | 1 | python |
keras-team/keras | 18,860 | Add `hard_swish` to `ops.nn` and dtype tests for `hard_swish` and `hard_sigmoid` | james77777778 | 4d362cea4236637746d373623b89be37dce9660e | f0b7062e4c6a62c521af491b09d97f009b1add0b | 2023-12-01 02:02:15+00:00 | 2023-12-01 05:07:34+00:00 | Add `hard_swish` to `ops.nn` and dtype tests for `hard_swish` and `hard_sigmoid`. This PR:
- add `hard_swish` to `ops.nn` to ensure consistency for all activation functions in the codebase
- add dtype inference tests for `hard_sigmoid` and `hard_swish`
I plan to add more dtype inference tests for `ops.nn`. I think we can focus on float types for these ops, as it's uncommon to use them with integer types.
Is it a good idea (adding more tests / skipping integer types for `ops.nn`)? | ./keras/backend/torch/nn.py | 1 | python |
keras-team/keras | 18,860 | Add `hard_swish` to `ops.nn` and dtype tests for `hard_swish` and `hard_sigmoid` | james77777778 | 4d362cea4236637746d373623b89be37dce9660e | f0b7062e4c6a62c521af491b09d97f009b1add0b | 2023-12-01 02:02:15+00:00 | 2023-12-01 05:07:34+00:00 | Add `hard_swish` to `ops.nn` and dtype tests for `hard_swish` and `hard_sigmoid`. This PR:
- add `hard_swish` to `ops.nn` to ensure consistency for all activation functions in the codebase
- add dtype inference tests for `hard_sigmoid` and `hard_swish`
I plan to add more dtype inference tests for `ops.nn`. I think we can focus on float types for these ops, as it's uncommon to use them with integer types.
Is it a good idea (adding more tests / skipping integer types for `ops.nn`)? | ./keras/ops/nn_test.py | 1 | python |
keras-team/keras | 18,860 | Add `hard_swish` to `ops.nn` and dtype tests for `hard_swish` and `hard_sigmoid` | james77777778 | 4d362cea4236637746d373623b89be37dce9660e | f0b7062e4c6a62c521af491b09d97f009b1add0b | 2023-12-01 02:02:15+00:00 | 2023-12-01 05:07:34+00:00 | Add `hard_swish` to `ops.nn` and dtype tests for `hard_swish` and `hard_sigmoid`. This PR:
- add `hard_swish` to `ops.nn` to ensure consistency for all activation functions in the codebase
- add dtype inference tests for `hard_sigmoid` and `hard_swish`
I plan to add more dtype inference tests for `ops.nn`. I think we can focus on float types for these ops, as it's uncommon to use them with integer types.
Is it a good idea (adding more tests / skipping integer types for `ops.nn`)? | ./keras/backend/numpy/nn.py | 1 | python |
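The dtype inference tests referenced in these rows come down to a simple property: for any float input dtype, the activation should return the same dtype rather than silently upcasting. A standalone sketch of that check (a hypothetical illustration using NumPy definitions of the activations, not the actual Keras test suite):

```python
import numpy as np

def hard_sigmoid(x):
    # Piecewise-linear sigmoid approximation: relu6(x + 3) / 6.
    # Constants are cast to x.dtype so the output dtype matches the input.
    three = np.asarray(3.0, dtype=x.dtype)
    six = np.asarray(6.0, dtype=x.dtype)
    return np.clip((x + three) / six, 0, 1)

def hard_swish(x):
    # hard_swish(x) = x * hard_sigmoid(x).
    return x * hard_sigmoid(x)

# Dtype-consistency check in the spirit of the PR: each float dtype in,
# the same float dtype out.
for dtype in (np.float16, np.float32, np.float64):
    x = np.linspace(-5.0, 5.0, 11).astype(dtype)
    for fn in (hard_sigmoid, hard_swish):
        assert fn(x).dtype == dtype, (fn.__name__, dtype)
print("dtype checks passed")
```

As the PR author suggests, restricting such checks to float dtypes is reasonable, since activations are rarely applied to integer tensors.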